├── CODE_OF_CONDUCT.md ├── LICENSE ├── README.md ├── android ├── .gitignore ├── app │ ├── .gitignore │ ├── build.gradle │ ├── download_model.gradle │ ├── proguard-rules.pro │ └── src │ │ ├── androidTest │ │ ├── assets │ │ │ ├── table.jpg │ │ │ └── table_results.txt │ │ └── java │ │ │ ├── AndroidManifest.xml │ │ │ └── org │ │ │ └── tensorflow │ │ │ └── lite │ │ │ └── examples │ │ │ └── detection │ │ │ └── DetectorTest.java │ │ └── main │ │ ├── AndroidManifest.xml │ │ ├── assets │ │ ├── coco.txt │ │ ├── kite.jpg │ │ ├── labelmap.txt │ │ └── yolov4-416-fp32.tflite │ │ ├── java │ │ └── org │ │ │ └── tensorflow │ │ │ └── lite │ │ │ └── examples │ │ │ └── detection │ │ │ ├── CameraActivity.java │ │ │ ├── CameraConnectionFragment.java │ │ │ ├── DetectorActivity.java │ │ │ ├── LegacyCameraConnectionFragment.java │ │ │ ├── MainActivity.java │ │ │ ├── customview │ │ │ ├── AutoFitTextureView.java │ │ │ ├── OverlayView.java │ │ │ ├── RecognitionScoreView.java │ │ │ └── ResultsView.java │ │ │ ├── env │ │ │ ├── BorderedText.java │ │ │ ├── ImageUtils.java │ │ │ ├── Logger.java │ │ │ ├── Size.java │ │ │ └── Utils.java │ │ │ ├── tflite │ │ │ ├── Classifier.java │ │ │ └── YoloV4Classifier.java │ │ │ └── tracking │ │ │ └── MultiBoxTracker.java │ │ └── res │ │ ├── drawable-hdpi │ │ └── ic_launcher.png │ │ ├── drawable-mdpi │ │ └── ic_launcher.png │ │ ├── drawable-v24 │ │ ├── ic_launcher_foreground.xml │ │ └── kite.jpg │ │ ├── drawable-xxhdpi │ │ ├── ic_launcher.png │ │ ├── icn_chevron_down.png │ │ ├── icn_chevron_up.png │ │ ├── tfl2_logo.png │ │ └── tfl2_logo_dark.png │ │ ├── drawable-xxxhdpi │ │ ├── caret.jpg │ │ ├── chair.jpg │ │ └── sample_image.jpg │ │ ├── drawable │ │ ├── bottom_sheet_bg.xml │ │ ├── ic_baseline_add.xml │ │ ├── ic_baseline_remove.xml │ │ ├── ic_launcher_background.xml │ │ └── rectangle.xml │ │ ├── layout │ │ ├── activity_main.xml │ │ ├── tfe_od_activity_camera.xml │ │ ├── tfe_od_camera_connection_fragment_tracking.xml │ │ └── tfe_od_layout_bottom_sheet.xml │ │ ├── 
mipmap-anydpi-v26 │ │ ├── ic_launcher.xml │ │ └── ic_launcher_round.xml │ │ ├── mipmap-hdpi │ │ ├── ic_launcher.png │ │ ├── ic_launcher_foreground.png │ │ └── ic_launcher_round.png │ │ ├── mipmap-mdpi │ │ ├── ic_launcher.png │ │ ├── ic_launcher_foreground.png │ │ └── ic_launcher_round.png │ │ ├── mipmap-xhdpi │ │ ├── ic_launcher.png │ │ ├── ic_launcher_foreground.png │ │ └── ic_launcher_round.png │ │ ├── mipmap-xxhdpi │ │ ├── ic_launcher.png │ │ ├── ic_launcher_foreground.png │ │ └── ic_launcher_round.png │ │ ├── mipmap-xxxhdpi │ │ ├── ic_launcher.png │ │ ├── ic_launcher_foreground.png │ │ └── ic_launcher_round.png │ │ └── values │ │ ├── colors.xml │ │ ├── dimens.xml │ │ ├── strings.xml │ │ └── styles.xml ├── build.gradle ├── gradle.properties ├── gradle │ └── wrapper │ │ ├── gradle-wrapper.jar │ │ └── gradle-wrapper.properties ├── gradlew ├── gradlew.bat └── settings.gradle ├── benchmarks.py ├── convert_tflite.py ├── convert_trt.py ├── core ├── backbone.py ├── common.py ├── config.py ├── dataset.py ├── utils.py └── yolov4.py ├── data ├── anchors │ ├── basline_anchors.txt │ ├── basline_tiny_anchors.txt │ ├── yolov3_anchors.txt │ └── yolov4_anchors.txt ├── classes │ ├── coco.names │ ├── voc.names │ └── yymnist.names ├── dataset │ ├── val2014.txt │ └── val2017.txt ├── girl.png ├── kite.jpg ├── performance.png └── road.mp4 ├── detect.py ├── detectvideo.py ├── evaluate.py ├── mAP ├── extra │ ├── intersect-gt-and-pred.py │ └── remove_space.py └── main.py ├── requirements-gpu.txt ├── requirements.txt ├── result-int8.png ├── result.png ├── save_model.py ├── scripts ├── coco_annotation.py ├── coco_convert.py ├── get_coco_dataset_2017.sh ├── google_utils.py ├── voc │ ├── README.md │ ├── get_voc2012.sh │ ├── voc_convert.py │ └── voc_make_names.py └── voc_annotation.py └── train.py /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Contributor Covenant Code of Conduct 2 | 3 | ## Our Pledge 4 | 5 | In the interest of 
fostering an open and welcoming environment, we as 6 | contributors and maintainers pledge to making participation in our project and 7 | our community a harassment-free experience for everyone, regardless of age, body 8 | size, disability, ethnicity, sex characteristics, gender identity and expression, 9 | level of experience, education, socio-economic status, nationality, personal 10 | appearance, race, religion, or sexual identity and orientation. 11 | 12 | ## Our Standards 13 | 14 | Examples of behavior that contributes to creating a positive environment 15 | include: 16 | 17 | * Using welcoming and inclusive language 18 | * Being respectful of differing viewpoints and experiences 19 | * Gracefully accepting constructive criticism 20 | * Focusing on what is best for the community 21 | * Showing empathy towards other community members 22 | 23 | Examples of unacceptable behavior by participants include: 24 | 25 | * The use of sexualized language or imagery and unwelcome sexual attention or 26 | advances 27 | * Trolling, insulting/derogatory comments, and personal or political attacks 28 | * Public or private harassment 29 | * Publishing others' private information, such as a physical or electronic 30 | address, without explicit permission 31 | * Other conduct which could reasonably be considered inappropriate in a 32 | professional setting 33 | 34 | ## Our Responsibilities 35 | 36 | Project maintainers are responsible for clarifying the standards of acceptable 37 | behavior and are expected to take appropriate and fair corrective action in 38 | response to any instances of unacceptable behavior. 39 | 40 | Project maintainers have the right and responsibility to remove, edit, or 41 | reject comments, commits, code, wiki edits, issues, and other contributions 42 | that are not aligned to this Code of Conduct, or to ban temporarily or 43 | permanently any contributor for other behaviors that they deem inappropriate, 44 | threatening, offensive, or harmful. 
45 | 46 | ## Scope 47 | 48 | This Code of Conduct applies both within project spaces and in public spaces 49 | when an individual is representing the project or its community. Examples of 50 | representing a project or community include using an official project e-mail 51 | address, posting via an official social media account, or acting as an appointed 52 | representative at an online or offline event. Representation of a project may be 53 | further defined and clarified by project maintainers. 54 | 55 | ## Enforcement 56 | 57 | Instances of abusive, harassing, or otherwise unacceptable behavior may be 58 | reported by contacting the project team at hunglc007@gmail.com. All 59 | complaints will be reviewed and investigated and will result in a response that 60 | is deemed necessary and appropriate to the circumstances. The project team is 61 | obligated to maintain confidentiality with regard to the reporter of an incident. 62 | Further details of specific enforcement policies may be posted separately. 63 | 64 | Project maintainers who do not follow or enforce the Code of Conduct in good 65 | faith may face temporary or permanent repercussions as determined by other 66 | members of the project's leadership. 
67 | 68 | ## Attribution 69 | 70 | This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, 71 | available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html 72 | 73 | [homepage]: https://www.contributor-covenant.org 74 | 75 | For answers to common questions about this code of conduct, see 76 | https://www.contributor-covenant.org/faq 77 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2020 Việt Hùng 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
22 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # tensorflow-yolov4-tflite 2 | [![license](https://img.shields.io/github/license/mashape/apistatus.svg)](LICENSE) 3 | 4 | YOLOv4 and YOLOv4-tiny implemented in TensorFlow 2.0. 5 | Convert YOLOv4, YOLOv3, and YOLO-tiny .weights to .pb, .tflite, and TensorRT formats for TensorFlow, TensorFlow Lite, and TensorRT. 6 | 7 | Download yolov4.weights file: https://drive.google.com/open?id=1cewMfusmPjYWbrnuJRuKhPMwRe_b9PaT 8 | 9 | 10 | ### Prerequisites 11 | * TensorFlow 2.3.0rc0 12 | 13 | ### Performance 14 |

15 | 16 | ### Demo 17 | 18 | ```bash 19 | # Convert darknet weights to tensorflow 20 | ## yolov4 21 | python save_model.py --weights ./data/yolov4.weights --output ./checkpoints/yolov4-416 --input_size 416 --model yolov4 22 | 23 | ## yolov4-tiny 24 | python save_model.py --weights ./data/yolov4-tiny.weights --output ./checkpoints/yolov4-tiny-416 --input_size 416 --model yolov4 --tiny 25 | 26 | # Run demo tensorflow 27 | python detect.py --weights ./checkpoints/yolov4-416 --size 416 --model yolov4 --image ./data/kite.jpg 28 | 29 | python detect.py --weights ./checkpoints/yolov4-tiny-416 --size 416 --model yolov4 --image ./data/kite.jpg --tiny 30 | 31 | ``` 32 | To run YOLOv3 or YOLOv3-tiny, change the flag to ``--model yolov3`` in the commands above. 33 | 34 | #### Output 35 | 36 | ##### Yolov4 original weight 37 |

38 | 39 | ##### Yolov4 tflite int8 40 |

41 | 42 | ### Convert to tflite 43 | 44 | ```bash 45 | # Save tf model for tflite conversion 46 | python save_model.py --weights ./data/yolov4.weights --output ./checkpoints/yolov4-416 --input_size 416 --model yolov4 --framework tflite 47 | 48 | # yolov4 49 | python convert_tflite.py --weights ./checkpoints/yolov4-416 --output ./checkpoints/yolov4-416.tflite 50 | 51 | # yolov4 quantize float16 52 | python convert_tflite.py --weights ./checkpoints/yolov4-416 --output ./checkpoints/yolov4-416-fp16.tflite --quantize_mode float16 53 | 54 | # yolov4 quantize int8 55 | python convert_tflite.py --weights ./checkpoints/yolov4-416 --output ./checkpoints/yolov4-416-int8.tflite --quantize_mode int8 --dataset ./coco_dataset/coco/val2017.txt 56 | 57 | # Run demo tflite model 58 | python detect.py --weights ./checkpoints/yolov4-416.tflite --size 416 --model yolov4 --image ./data/kite.jpg --framework tflite 59 | ``` 60 | Yolov4 and Yolov4-tiny int8 quantization still have some issues; I will try to fix them.
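For background on why int8 is harder than float16: TFLite's post-training int8 quantization is affine, storing each real value as an 8-bit integer `q` with `real ≈ (q - zero_point) * scale`, so every tensor is rounded to one of 256 levels and the rounding error compounds through a deep network like YOLOv4. A minimal NumPy sketch of the scheme (illustrative only, not code from this repo — the converter calibrates real per-tensor scales from the images passed via ``--dataset``):

```python
import numpy as np

def quantize(x, scale, zero_point):
    # TFLite-style affine quantization: real ~= (q - zero_point) * scale
    q = np.round(x / scale) + zero_point
    return np.clip(q, -128, 127).astype(np.int8)

def dequantize(q, scale, zero_point):
    return (q.astype(np.float32) - zero_point) * scale

# Quantize a tensor whose values span [-1, 1] with a per-tensor scale.
x = np.array([-1.0, 0.0, 0.5, 1.0], dtype=np.float32)
scale, zero_point = 1.0 / 127, 0
q = quantize(x, scale, zero_point)
x_hat = dequantize(q, scale, zero_point)
print(np.max(np.abs(x - x_hat)))  # worst-case rounding error, at most scale / 2
```

Float16 quantization keeps roughly three decimal digits of precision per weight, which is why it tends to be the safer drop-in; int8 accuracy depends heavily on how well the calibration data covers the activation ranges.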
You can try Yolov3 and Yolov3-tiny int8 quantization in the meantime. 61 | ### Convert to TensorRT 62 | ```bash # yolov3 63 | python save_model.py --weights ./data/yolov3.weights --output ./checkpoints/yolov3.tf --input_size 416 --model yolov3 64 | python convert_trt.py --weights ./checkpoints/yolov3.tf --quantize_mode float16 --output ./checkpoints/yolov3-trt-fp16-416 65 | 66 | # yolov3-tiny 67 | python save_model.py --weights ./data/yolov3-tiny.weights --output ./checkpoints/yolov3-tiny.tf --input_size 416 --tiny 68 | python convert_trt.py --weights ./checkpoints/yolov3-tiny.tf --quantize_mode float16 --output ./checkpoints/yolov3-tiny-trt-fp16-416 69 | 70 | # yolov4 71 | python save_model.py --weights ./data/yolov4.weights --output ./checkpoints/yolov4.tf --input_size 416 --model yolov4 72 | python convert_trt.py --weights ./checkpoints/yolov4.tf --quantize_mode float16 --output ./checkpoints/yolov4-trt-fp16-416 73 | ``` 74 | 75 | ### Evaluate on COCO 2017 Dataset 76 | ```bash 77 | # run scripts/get_coco_dataset_2017.sh to download the COCO 2017 Dataset 78 | # preprocess coco dataset 79 | cd data 80 | mkdir dataset 81 | cd .. 82 | cd scripts 83 | python coco_convert.py --input ./coco/annotations/instances_val2017.json --output val2017.pkl 84 | python coco_annotation.py --coco_path ./coco 85 | cd .. 86 | 87 | # evaluate yolov4 model 88 | python evaluate.py --weights ./data/yolov4.weights 89 | cd mAP/extra 90 | python remove_space.py 91 | cd .. 
92 | python main.py --output results_yolov4_tf 93 | ``` 94 | #### mAP50 on COCO 2017 Dataset 95 | 96 | | Detection | 512x512 | 416x416 | 320x320 | 97 | |-------------|---------|---------|---------| 98 | | YoloV3 | 55.43 | 52.32 | | 99 | | YoloV4 | 61.96 | 57.33 | | 100 | 101 | ### Benchmark 102 | ```bash 103 | python benchmarks.py --size 416 --model yolov4 --weights ./data/yolov4.weights 104 | ``` 105 | #### TensorRT performance 106 | 107 | | YoloV4 416 images/s | FP32 | FP16 | INT8 | 108 | |---------------------|----------|----------|----------| 109 | | Batch size 1 | 55 | 116 | | 110 | | Batch size 8 | 70 | 152 | | 111 | 112 | #### Tesla P100 113 | 114 | | Detection | 512x512 | 416x416 | 320x320 | 115 | |-------------|---------|---------|---------| 116 | | YoloV3 FPS | 40.6 | 49.4 | 61.3 | 117 | | YoloV4 FPS | 33.4 | 41.7 | 50.0 | 118 | 119 | #### Tesla K80 120 | 121 | | Detection | 512x512 | 416x416 | 320x320 | 122 | |-------------|---------|---------|---------| 123 | | YoloV3 FPS | 10.8 | 12.9 | 17.6 | 124 | | YoloV4 FPS | 9.6 | 11.7 | 16.0 | 125 | 126 | #### Tesla T4 127 | 128 | | Detection | 512x512 | 416x416 | 320x320 | 129 | |-------------|---------|---------|---------| 130 | | YoloV3 FPS | 27.6 | 32.3 | 45.1 | 131 | | YoloV4 FPS | 24.0 | 30.3 | 40.1 | 132 | 133 | #### Tesla P4 134 | 135 | | Detection | 512x512 | 416x416 | 320x320 | 136 | |-------------|---------|---------|---------| 137 | | YoloV3 FPS | 20.2 | 24.2 | 31.2 | 138 | | YoloV4 FPS | 16.2 | 20.2 | 26.5 | 139 | 140 | #### Macbook Pro 15 (2.3GHz i7) 141 | 142 | | Detection | 512x512 | 416x416 | 320x320 | 143 | |-------------|---------|---------|---------| 144 | | YoloV3 FPS | | | | 145 | | YoloV4 FPS | | | | 146 | 147 | ### Training your own model 148 | ```bash 149 | # Prepare your dataset 150 | # If you want to train from scratch: 151 | # In config.py set FISRT_STAGE_EPOCHS=0 152 | # Run script: 153 | python train.py 154 | 155 | # Transfer learning: 156 | python train.py --weights
./data/yolov4.weights 157 | ``` 158 | The training performance is not fully reproduced yet, so I recommend using Alex's [Darknet](https://github.com/AlexeyAB/darknet) to train your own data, then converting the .weights to tensorflow or tflite. 159 | 160 | 161 | 162 | ### TODO 163 | * [x] Convert YOLOv4 to TensorRT 164 | * [x] YOLOv4 tflite on android 165 | * [ ] YOLOv4 tflite on ios 166 | * [x] Training code 167 | * [x] Update scale xy 168 | * [ ] ciou 169 | * [ ] Mosaic data augmentation 170 | * [x] Mish activation 171 | * [x] yolov4 tflite version 172 | * [x] yolov4 int8 tflite version for mobile 173 | 174 | ### References 175 | 176 | * YOLOv4: Optimal Speed and Accuracy of Object Detection [YOLOv4](https://arxiv.org/abs/2004.10934). 177 | * [darknet](https://github.com/AlexeyAB/darknet) 178 | 179 | My project is inspired by these previous fantastic YOLOv3 implementations: 180 | * [Yolov3 tensorflow](https://github.com/YunYang1994/tensorflow-yolov3) 181 | * [Yolov3 tf2](https://github.com/zzh8829/yolov3-tf2) 182 | -------------------------------------------------------------------------------- /android/.gitignore: -------------------------------------------------------------------------------- 1 | *.iml 2 | .gradle 3 | /local.properties 4 | /.idea/libraries 5 | /.idea/modules.xml 6 | /.idea/workspace.xml 7 | .DS_Store 8 | /build 9 | /captures 10 | .externalNativeBuild 11 | 12 | /.gradle/ 13 | /.idea/ 14 | -------------------------------------------------------------------------------- /android/app/.gitignore: -------------------------------------------------------------------------------- 1 | /build 2 | /build/ -------------------------------------------------------------------------------- /android/app/build.gradle: -------------------------------------------------------------------------------- 1 | apply plugin: 'com.android.application' 2 | apply plugin: 'de.undercouch.download' 3 | 4 | android { 5 | compileSdkVersion 28 6 | buildToolsVersion '28.0.3' 7 |
defaultConfig { 8 | applicationId "org.tensorflow.lite.examples.detection" 9 | minSdkVersion 21 10 | targetSdkVersion 28 11 | versionCode 1 12 | versionName "1.0" 13 | 14 | // ndk { 15 | // abiFilters 'armeabi-v7a', 'arm64-v8a' 16 | // } 17 | } 18 | buildTypes { 19 | release { 20 | minifyEnabled false 21 | proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro' 22 | } 23 | } 24 | aaptOptions { 25 | noCompress "tflite" 26 | } 27 | compileOptions { 28 | sourceCompatibility = '1.8' 29 | targetCompatibility = '1.8' 30 | } 31 | lintOptions { 32 | abortOnError false 33 | } 34 | } 35 | 36 | // import DownloadModels task 37 | project.ext.ASSET_DIR = projectDir.toString() + '/src/main/assets' 38 | project.ext.TMP_DIR = project.buildDir.toString() + '/downloads' 39 | 40 | // Download default models; if you wish to use your own models then 41 | // place them in the "assets" directory and comment out this line. 42 | //apply from: "download_model.gradle" 43 | 44 | apply from: 'download_model.gradle' 45 | 46 | dependencies { 47 | implementation fileTree(dir: 'libs', include: ['*.jar', '*.aar']) 48 | implementation 'androidx.appcompat:appcompat:1.1.0' 49 | implementation 'androidx.coordinatorlayout:coordinatorlayout:1.1.0' 50 | implementation 'com.google.android.material:material:1.1.0' 51 | // implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly' 52 | // implementation 'org.tensorflow:tensorflow-lite-gpu:0.0.0-nightly' 53 | implementation 'org.tensorflow:tensorflow-lite:2.2.0' 54 | implementation 'org.tensorflow:tensorflow-lite-gpu:2.2.0' 55 | // implementation 'org.tensorflow:tensorflow-lite:0.0.0-gpu-experimental' 56 | implementation 'androidx.constraintlayout:constraintlayout:1.1.3' 57 | implementation 'com.google.code.gson:gson:2.8.6' 58 | androidTestImplementation 'androidx.test.ext:junit:1.1.1' 59 | androidTestImplementation 'com.android.support.test:rules:1.0.2' 60 | androidTestImplementation 'com.google.truth:truth:1.0.1' 61 | } 62 | 
-------------------------------------------------------------------------------- /android/app/download_model.gradle: -------------------------------------------------------------------------------- 1 | 2 | task downloadZipFile(type: Download) { 3 | src 'http://storage.googleapis.com/download.tensorflow.org/models/tflite/coco_ssd_mobilenet_v1_1.0_quant_2018_06_29.zip' 4 | dest new File(buildDir, 'zips/') 5 | overwrite false 6 | } 7 | 8 | 9 | task downloadAndUnzipFile(dependsOn: downloadZipFile, type: Copy) { 10 | from zipTree(downloadZipFile.dest) 11 | into project.ext.ASSET_DIR 12 | } 13 | 14 | 15 | task extractModels(type: Copy) { 16 | dependsOn downloadAndUnzipFile 17 | } 18 | 19 | tasks.whenTaskAdded { task -> 20 | if (task.name == 'assembleDebug') { 21 | task.dependsOn 'extractModels' 22 | } 23 | if (task.name == 'assembleRelease') { 24 | task.dependsOn 'extractModels' 25 | } 26 | } -------------------------------------------------------------------------------- /android/app/proguard-rules.pro: -------------------------------------------------------------------------------- 1 | # Add project specific ProGuard rules here. 2 | # You can control the set of applied configuration files using the 3 | # proguardFiles setting in build.gradle. 4 | # 5 | # For more details, see 6 | # http://developer.android.com/guide/developing/tools/proguard.html 7 | 8 | # If your project uses WebView with JS, uncomment the following 9 | # and specify the fully qualified class name to the JavaScript interface 10 | # class: 11 | #-keepclassmembers class fqcn.of.javascript.interface.for.webview { 12 | # public *; 13 | #} 14 | 15 | # Uncomment this to preserve the line number information for 16 | # debugging stack traces. 17 | #-keepattributes SourceFile,LineNumberTable 18 | 19 | # If you keep the line number information, uncomment this to 20 | # hide the original source file name. 
21 | #-renamesourcefileattribute SourceFile 22 | -------------------------------------------------------------------------------- /android/app/src/androidTest/assets/table.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/androidTest/assets/table.jpg -------------------------------------------------------------------------------- /android/app/src/androidTest/assets/table_results.txt: -------------------------------------------------------------------------------- 1 | dining_table 27.492085 97.94615 623.1435 444.8627 0.48828125 2 | knife 342.53433 243.71082 583.89185 416.34595 0.4765625 3 | cup 68.025925 197.5857 202.02031 374.2206 0.4375 4 | book 185.43098 139.64153 244.51149 203.37737 0.3125 5 | -------------------------------------------------------------------------------- /android/app/src/androidTest/java/AndroidManifest.xml: -------------------------------------------------------------------------------- 1 | 2 | 4 | 5 | -------------------------------------------------------------------------------- /android/app/src/androidTest/java/org/tensorflow/lite/examples/detection/DetectorTest.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2020 The TensorFlow Authors. All Rights Reserved. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package org.tensorflow.lite.examples.detection; 18 | 19 | import static com.google.common.truth.Truth.assertThat; 20 | import static java.lang.Math.abs; 21 | import static java.lang.Math.max; 22 | import static java.lang.Math.min; 23 | 24 | import android.content.res.AssetManager; 25 | import android.graphics.Bitmap; 26 | import android.graphics.Bitmap.Config; 27 | import android.graphics.BitmapFactory; 28 | import android.graphics.Canvas; 29 | import android.graphics.Matrix; 30 | import android.graphics.RectF; 31 | import android.util.Size; 32 | import androidx.test.ext.junit.runners.AndroidJUnit4; 33 | import androidx.test.platform.app.InstrumentationRegistry; 34 | import java.io.IOException; 35 | import java.io.InputStream; 36 | import java.util.ArrayList; 37 | import java.util.List; 38 | import java.util.Scanner; 39 | import org.junit.Before; 40 | import org.junit.Test; 41 | import org.junit.runner.RunWith; 42 | import org.tensorflow.lite.examples.detection.env.ImageUtils; 43 | import org.tensorflow.lite.examples.detection.tflite.Classifier; 44 | import org.tensorflow.lite.examples.detection.tflite.Classifier.Recognition; 45 | import org.tensorflow.lite.examples.detection.tflite.TFLiteObjectDetectionAPIModel; 46 | 47 | /** Golden test for Object Detection Reference app. 
*/ 48 | @RunWith(AndroidJUnit4.class) 49 | public class DetectorTest { 50 | 51 | private static final int MODEL_INPUT_SIZE = 300; 52 | private static final boolean IS_MODEL_QUANTIZED = true; 53 | private static final String MODEL_FILE = "detect.tflite"; 54 | private static final String LABELS_FILE = "file:///android_asset/labelmap.txt"; 55 | private static final Size IMAGE_SIZE = new Size(640, 480); 56 | 57 | private Classifier detector; 58 | private Bitmap croppedBitmap; 59 | private Matrix frameToCropTransform; 60 | private Matrix cropToFrameTransform; 61 | 62 | @Before 63 | public void setUp() throws IOException { 64 | AssetManager assetManager = 65 | InstrumentationRegistry.getInstrumentation().getContext().getAssets(); 66 | detector = 67 | TFLiteObjectDetectionAPIModel.create( 68 | assetManager, 69 | MODEL_FILE, 70 | LABELS_FILE, 71 | MODEL_INPUT_SIZE, 72 | IS_MODEL_QUANTIZED); 73 | int cropSize = MODEL_INPUT_SIZE; 74 | int previewWidth = IMAGE_SIZE.getWidth(); 75 | int previewHeight = IMAGE_SIZE.getHeight(); 76 | int sensorOrientation = 0; 77 | croppedBitmap = Bitmap.createBitmap(cropSize, cropSize, Config.ARGB_8888); 78 | 79 | frameToCropTransform = 80 | ImageUtils.getTransformationMatrix( 81 | previewWidth, previewHeight, 82 | cropSize, cropSize, 83 | sensorOrientation, false); 84 | cropToFrameTransform = new Matrix(); 85 | frameToCropTransform.invert(cropToFrameTransform); 86 | } 87 | 88 | @Test 89 | public void detectionResultsShouldNotChange() throws Exception { 90 | Canvas canvas = new Canvas(croppedBitmap); 91 | canvas.drawBitmap(loadImage("table.jpg"), frameToCropTransform, null); 92 | final List results = detector.recognizeImage(croppedBitmap); 93 | final List expected = loadRecognitions("table_results.txt"); 94 | 95 | for (Recognition target : expected) { 96 | // Find a matching result in results 97 | boolean matched = false; 98 | for (Recognition item : results) { 99 | RectF bbox = new RectF(); 100 | cropToFrameTransform.mapRect(bbox, 
item.getLocation()); 101 | if (item.getTitle().equals(target.getTitle()) 102 | && matchBoundingBoxes(bbox, target.getLocation()) 103 | && matchConfidence(item.getConfidence(), target.getConfidence())) { 104 | matched = true; 105 | break; 106 | } 107 | } 108 | assertThat(matched).isTrue(); 109 | } 110 | } 111 | 112 | // Confidence tolerance: absolute 1% 113 | private static boolean matchConfidence(float a, float b) { 114 | return abs(a - b) < 0.01; 115 | } 116 | 117 | // Bounding Box tolerance: overlapped area > 95% of each one 118 | private static boolean matchBoundingBoxes(RectF a, RectF b) { 119 | float areaA = a.width() * a.height(); 120 | float areaB = b.width() * b.height(); 121 | RectF overlapped = 122 | new RectF( 123 | max(a.left, b.left), max(a.top, b.top), min(a.right, b.right), min(a.bottom, b.bottom)); 124 | float overlappedArea = overlapped.width() * overlapped.height(); 125 | return overlappedArea > 0.95 * areaA && overlappedArea > 0.95 * areaB; 126 | } 127 | 128 | private static Bitmap loadImage(String fileName) throws Exception { 129 | AssetManager assetManager = 130 | InstrumentationRegistry.getInstrumentation().getContext().getAssets(); 131 | InputStream inputStream = assetManager.open(fileName); 132 | return BitmapFactory.decodeStream(inputStream); 133 | } 134 | 135 | // The format of result: 136 | // category bbox.left bbox.top bbox.right bbox.bottom confidence 137 | // ... 138 | // Example: 139 | // Apple 25 30 75 80 0.99 140 | // Banana 25 90 75 200 0.98 141 | // ... 
142 | private static List loadRecognitions(String fileName) throws Exception { 143 | AssetManager assetManager = 144 | InstrumentationRegistry.getInstrumentation().getContext().getAssets(); 145 | InputStream inputStream = assetManager.open(fileName); 146 | Scanner scanner = new Scanner(inputStream); 147 | List result = new ArrayList<>(); 148 | while (scanner.hasNext()) { 149 | String category = scanner.next(); 150 | category = category.replace('_', ' '); 151 | if (!scanner.hasNextFloat()) { 152 | break; 153 | } 154 | float left = scanner.nextFloat(); 155 | float top = scanner.nextFloat(); 156 | float right = scanner.nextFloat(); 157 | float bottom = scanner.nextFloat(); 158 | RectF boundingBox = new RectF(left, top, right, bottom); 159 | float confidence = scanner.nextFloat(); 160 | Recognition recognition = new Recognition(null, category, confidence, boundingBox); 161 | result.add(recognition); 162 | } 163 | return result; 164 | } 165 | } 166 | -------------------------------------------------------------------------------- /android/app/src/main/AndroidManifest.xml: -------------------------------------------------------------------------------- 1 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 20 | 21 | 25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | 35 | 36 | -------------------------------------------------------------------------------- /android/app/src/main/assets/coco.txt: -------------------------------------------------------------------------------- 1 | person 2 | bicycle 3 | car 4 | motorbike 5 | aeroplane 6 | bus 7 | train 8 | truck 9 | boat 10 | traffic light 11 | fire hydrant 12 | stop sign 13 | parking meter 14 | bench 15 | bird 16 | cat 17 | dog 18 | horse 19 | sheep 20 | cow 21 | elephant 22 | bear 23 | zebra 24 | giraffe 25 | backpack 26 | umbrella 27 | handbag 28 | tie 29 | suitcase 30 | frisbee 31 | skis 32 | snowboard 33 | sports ball 34 | kite 35 | baseball bat 36 | baseball glove 37 | skateboard 38 | surfboard 39 | tennis racket 40 | 
bottle 41 | wine glass 42 | cup 43 | fork 44 | knife 45 | spoon 46 | bowl 47 | banana 48 | apple 49 | sandwich 50 | orange 51 | broccoli 52 | carrot 53 | hot dog 54 | pizza 55 | donut 56 | cake 57 | chair 58 | sofa 59 | potted plant 60 | bed 61 | dining table 62 | toilet 63 | tvmonitor 64 | laptop 65 | mouse 66 | remote 67 | keyboard 68 | cell phone 69 | microwave 70 | oven 71 | toaster 72 | sink 73 | refrigerator 74 | book 75 | clock 76 | vase 77 | scissors 78 | teddy bear 79 | hair drier 80 | toothbrush 81 | -------------------------------------------------------------------------------- /android/app/src/main/assets/kite.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/main/assets/kite.jpg -------------------------------------------------------------------------------- /android/app/src/main/assets/labelmap.txt: -------------------------------------------------------------------------------- 1 | ??? 2 | person 3 | bicycle 4 | car 5 | motorcycle 6 | airplane 7 | bus 8 | train 9 | truck 10 | boat 11 | traffic light 12 | fire hydrant 13 | ??? 14 | stop sign 15 | parking meter 16 | bench 17 | bird 18 | cat 19 | dog 20 | horse 21 | sheep 22 | cow 23 | elephant 24 | bear 25 | zebra 26 | giraffe 27 | ??? 28 | backpack 29 | umbrella 30 | ??? 31 | ??? 32 | handbag 33 | tie 34 | suitcase 35 | frisbee 36 | skis 37 | snowboard 38 | sports ball 39 | kite 40 | baseball bat 41 | baseball glove 42 | skateboard 43 | surfboard 44 | tennis racket 45 | bottle 46 | ??? 47 | wine glass 48 | cup 49 | fork 50 | knife 51 | spoon 52 | bowl 53 | banana 54 | apple 55 | sandwich 56 | orange 57 | broccoli 58 | carrot 59 | hot dog 60 | pizza 61 | donut 62 | cake 63 | chair 64 | couch 65 | potted plant 66 | bed 67 | ??? 68 | dining table 69 | ??? 70 | ??? 71 | toilet 72 | ??? 
73 | tv 74 | laptop 75 | mouse 76 | remote 77 | keyboard 78 | cell phone 79 | microwave 80 | oven 81 | toaster 82 | sink 83 | refrigerator 84 | ??? 85 | book 86 | clock 87 | vase 88 | scissors 89 | teddy bear 90 | hair drier 91 | toothbrush 92 | -------------------------------------------------------------------------------- /android/app/src/main/assets/yolov4-416-fp32.tflite: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/main/assets/yolov4-416-fp32.tflite -------------------------------------------------------------------------------- /android/app/src/main/java/org/tensorflow/lite/examples/detection/LegacyCameraConnectionFragment.java: -------------------------------------------------------------------------------- 1 | package org.tensorflow.lite.examples.detection; 2 | 3 | /* 4 | * Copyright 2019 The TensorFlow Authors. All Rights Reserved. 5 | * 6 | * Licensed under the Apache License, Version 2.0 (the "License"); 7 | * you may not use this file except in compliance with the License. 8 | * You may obtain a copy of the License at 9 | * 10 | * http://www.apache.org/licenses/LICENSE-2.0 11 | * 12 | * Unless required by applicable law or agreed to in writing, software 13 | * distributed under the License is distributed on an "AS IS" BASIS, 14 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 15 | * See the License for the specific language governing permissions and 16 | * limitations under the License. 
17 | */ 18 | 19 | import android.app.Fragment; 20 | import android.graphics.SurfaceTexture; 21 | import android.hardware.Camera; 22 | import android.hardware.Camera.CameraInfo; 23 | import android.os.Bundle; 24 | import android.os.Handler; 25 | import android.os.HandlerThread; 26 | import android.util.Size; 27 | import android.util.SparseIntArray; 28 | import android.view.LayoutInflater; 29 | import android.view.Surface; 30 | import android.view.TextureView; 31 | import android.view.View; 32 | import android.view.ViewGroup; 33 | import java.io.IOException; 34 | import java.util.List; 35 | import org.tensorflow.lite.examples.detection.customview.AutoFitTextureView; 36 | import org.tensorflow.lite.examples.detection.env.ImageUtils; 37 | import org.tensorflow.lite.examples.detection.env.Logger; 38 | 39 | public class LegacyCameraConnectionFragment extends Fragment { 40 | private static final Logger LOGGER = new Logger(); 41 | /** Conversion from screen rotation to JPEG orientation. */ 42 | private static final SparseIntArray ORIENTATIONS = new SparseIntArray(); 43 | 44 | static { 45 | ORIENTATIONS.append(Surface.ROTATION_0, 90); 46 | ORIENTATIONS.append(Surface.ROTATION_90, 0); 47 | ORIENTATIONS.append(Surface.ROTATION_180, 270); 48 | ORIENTATIONS.append(Surface.ROTATION_270, 180); 49 | } 50 | 51 | private Camera camera; 52 | private Camera.PreviewCallback imageListener; 53 | private Size desiredSize; 54 | /** The layout identifier to inflate for this Fragment. */ 55 | private int layout; 56 | /** An {@link AutoFitTextureView} for camera preview. */ 57 | private AutoFitTextureView textureView; 58 | /** 59 | * {@link TextureView.SurfaceTextureListener} handles several lifecycle events on a {@link 60 | * TextureView}. 
61 | */ 62 | private final TextureView.SurfaceTextureListener surfaceTextureListener = 63 | new TextureView.SurfaceTextureListener() { 64 | @Override 65 | public void onSurfaceTextureAvailable( 66 | final SurfaceTexture texture, final int width, final int height) { 67 | 68 | int index = getCameraId(); 69 | camera = Camera.open(index); 70 | 71 | try { 72 | Camera.Parameters parameters = camera.getParameters(); 73 | List<String> focusModes = parameters.getSupportedFocusModes(); 74 | if (focusModes != null 75 | && focusModes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) { 76 | parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE); 77 | } 78 | List<Camera.Size> cameraSizes = parameters.getSupportedPreviewSizes(); 79 | Size[] sizes = new Size[cameraSizes.size()]; 80 | int i = 0; 81 | for (Camera.Size size : cameraSizes) { 82 | sizes[i++] = new Size(size.width, size.height); 83 | } 84 | Size previewSize = 85 | CameraConnectionFragment.chooseOptimalSize( 86 | sizes, desiredSize.getWidth(), desiredSize.getHeight()); 87 | parameters.setPreviewSize(previewSize.getWidth(), previewSize.getHeight()); 88 | camera.setDisplayOrientation(90); 89 | camera.setParameters(parameters); 90 | camera.setPreviewTexture(texture); 91 | } catch (IOException exception) { 92 | camera.release(); 93 | } 94 | 95 | camera.setPreviewCallbackWithBuffer(imageListener); 96 | Camera.Size s = camera.getParameters().getPreviewSize(); 97 | camera.addCallbackBuffer(new byte[ImageUtils.getYUVByteSize(s.height, s.width)]); 98 | 99 | textureView.setAspectRatio(s.height, s.width); 100 | 101 | camera.startPreview(); 102 | } 103 | 104 | @Override 105 | public void onSurfaceTextureSizeChanged( 106 | final SurfaceTexture texture, final int width, final int height) {} 107 | 108 | @Override 109 | public boolean onSurfaceTextureDestroyed(final SurfaceTexture texture) { 110 | return true; 111 | } 112 | 113 | @Override 114 | public void onSurfaceTextureUpdated(final SurfaceTexture texture) {} 115 | }; 116 | 
/** An additional thread for running tasks that shouldn't block the UI. */ 117 | private HandlerThread backgroundThread; 118 | 119 | public LegacyCameraConnectionFragment( 120 | final Camera.PreviewCallback imageListener, final int layout, final Size desiredSize) { 121 | this.imageListener = imageListener; 122 | this.layout = layout; 123 | this.desiredSize = desiredSize; 124 | } 125 | 126 | @Override 127 | public View onCreateView( 128 | final LayoutInflater inflater, final ViewGroup container, final Bundle savedInstanceState) { 129 | return inflater.inflate(layout, container, false); 130 | } 131 | 132 | @Override 133 | public void onViewCreated(final View view, final Bundle savedInstanceState) { 134 | textureView = (AutoFitTextureView) view.findViewById(R.id.texture); 135 | } 136 | 137 | @Override 138 | public void onActivityCreated(final Bundle savedInstanceState) { 139 | super.onActivityCreated(savedInstanceState); 140 | } 141 | 142 | @Override 143 | public void onResume() { 144 | super.onResume(); 145 | startBackgroundThread(); 146 | // When the screen is turned off and turned back on, the SurfaceTexture is already 147 | // available, and "onSurfaceTextureAvailable" will not be called. In that case, we can open 148 | // a camera and start preview from here (otherwise, we wait until the surface is ready in 149 | // the SurfaceTextureListener). 150 | 151 | if (textureView.isAvailable()) { 152 | camera.startPreview(); 153 | } else { 154 | textureView.setSurfaceTextureListener(surfaceTextureListener); 155 | } 156 | } 157 | 158 | @Override 159 | public void onPause() { 160 | stopCamera(); 161 | stopBackgroundThread(); 162 | super.onPause(); 163 | } 164 | 165 | /** Starts a background thread and its {@link Handler}. */ 166 | private void startBackgroundThread() { 167 | backgroundThread = new HandlerThread("CameraBackground"); 168 | backgroundThread.start(); 169 | } 170 | 171 | /** Stops the background thread and its {@link Handler}. 
*/ 172 | private void stopBackgroundThread() { 173 | backgroundThread.quitSafely(); 174 | try { 175 | backgroundThread.join(); 176 | backgroundThread = null; 177 | } catch (final InterruptedException e) { 178 | LOGGER.e(e, "Exception!"); 179 | } 180 | } 181 | 182 | protected void stopCamera() { 183 | if (camera != null) { 184 | camera.stopPreview(); 185 | camera.setPreviewCallback(null); 186 | camera.release(); 187 | camera = null; 188 | } 189 | } 190 | 191 | private int getCameraId() { 192 | CameraInfo ci = new CameraInfo(); 193 | for (int i = 0; i < Camera.getNumberOfCameras(); i++) { 194 | Camera.getCameraInfo(i, ci); 195 | if (ci.facing == CameraInfo.CAMERA_FACING_BACK) return i; 196 | } 197 | return -1; // No camera found 198 | } 199 | } 200 | -------------------------------------------------------------------------------- /android/app/src/main/java/org/tensorflow/lite/examples/detection/MainActivity.java: -------------------------------------------------------------------------------- 1 | package org.tensorflow.lite.examples.detection; 2 | 3 | import androidx.appcompat.app.AppCompatActivity; 4 | 5 | import android.content.Context; 6 | import android.content.Intent; 7 | import android.graphics.Bitmap; 8 | import android.graphics.Canvas; 9 | import android.graphics.Color; 10 | import android.graphics.Matrix; 11 | import android.graphics.Paint; 12 | import android.graphics.RectF; 13 | import android.os.Bundle; 14 | import android.os.Handler; 15 | import android.util.Log; 16 | import android.view.View; 17 | import android.widget.Button; 18 | import android.widget.ImageView; 19 | import android.widget.Toast; 20 | 21 | import org.tensorflow.lite.examples.detection.customview.OverlayView; 22 | import org.tensorflow.lite.examples.detection.env.ImageUtils; 23 | import org.tensorflow.lite.examples.detection.env.Logger; 24 | import org.tensorflow.lite.examples.detection.env.Utils; 25 | import org.tensorflow.lite.examples.detection.tflite.Classifier; 26 | import 
org.tensorflow.lite.examples.detection.tflite.YoloV4Classifier; 27 | import org.tensorflow.lite.examples.detection.tracking.MultiBoxTracker; 28 | 29 | import java.io.IOException; 30 | import java.util.LinkedList; 31 | import java.util.List; 32 | 33 | public class MainActivity extends AppCompatActivity { 34 | 35 | public static final float MINIMUM_CONFIDENCE_TF_OD_API = 0.5f; 36 | 37 | @Override 38 | protected void onCreate(Bundle savedInstanceState) { 39 | super.onCreate(savedInstanceState); 40 | setContentView(R.layout.activity_main); 41 | 42 | cameraButton = findViewById(R.id.cameraButton); 43 | detectButton = findViewById(R.id.detectButton); 44 | imageView = findViewById(R.id.imageView); 45 | 46 | cameraButton.setOnClickListener(v -> startActivity(new Intent(MainActivity.this, DetectorActivity.class))); 47 | 48 | detectButton.setOnClickListener(v -> { 49 | Handler handler = new Handler(); 50 | 51 | new Thread(() -> { 52 | final List<Classifier.Recognition> results = detector.recognizeImage(cropBitmap); 53 | handler.post(new Runnable() { 54 | @Override 55 | public void run() { 56 | handleResult(cropBitmap, results); 57 | } 58 | }); 59 | }).start(); 60 | 61 | }); 62 | this.sourceBitmap = Utils.getBitmapFromAsset(MainActivity.this, "kite.jpg"); 63 | 64 | this.cropBitmap = Utils.processBitmap(sourceBitmap, TF_OD_API_INPUT_SIZE); 65 | 66 | this.imageView.setImageBitmap(cropBitmap); 67 | 68 | initBox(); 69 | } 70 | 71 | private static final Logger LOGGER = new Logger(); 72 | 73 | public static final int TF_OD_API_INPUT_SIZE = 416; 74 | 75 | private static final boolean TF_OD_API_IS_QUANTIZED = false; 76 | 77 | private static final String TF_OD_API_MODEL_FILE = "yolov4-416-fp32.tflite"; 78 | 79 | private static final String TF_OD_API_LABELS_FILE = "file:///android_asset/coco.txt"; 80 | 81 | // Minimum detection confidence to track a detection. 
82 | private static final boolean MAINTAIN_ASPECT = false; 83 | private Integer sensorOrientation = 90; 84 | 85 | private Classifier detector; 86 | 87 | private Matrix frameToCropTransform; 88 | private Matrix cropToFrameTransform; 89 | private MultiBoxTracker tracker; 90 | private OverlayView trackingOverlay; 91 | 92 | protected int previewWidth = 0; 93 | protected int previewHeight = 0; 94 | 95 | private Bitmap sourceBitmap; 96 | private Bitmap cropBitmap; 97 | 98 | private Button cameraButton, detectButton; 99 | private ImageView imageView; 100 | 101 | private void initBox() { 102 | previewHeight = TF_OD_API_INPUT_SIZE; 103 | previewWidth = TF_OD_API_INPUT_SIZE; 104 | frameToCropTransform = 105 | ImageUtils.getTransformationMatrix( 106 | previewWidth, previewHeight, 107 | TF_OD_API_INPUT_SIZE, TF_OD_API_INPUT_SIZE, 108 | sensorOrientation, MAINTAIN_ASPECT); 109 | 110 | cropToFrameTransform = new Matrix(); 111 | frameToCropTransform.invert(cropToFrameTransform); 112 | 113 | tracker = new MultiBoxTracker(this); 114 | trackingOverlay = findViewById(R.id.tracking_overlay); 115 | trackingOverlay.addCallback( 116 | canvas -> tracker.draw(canvas)); 117 | 118 | tracker.setFrameConfiguration(TF_OD_API_INPUT_SIZE, TF_OD_API_INPUT_SIZE, sensorOrientation); 119 | 120 | try { 121 | detector = 122 | YoloV4Classifier.create( 123 | getAssets(), 124 | TF_OD_API_MODEL_FILE, 125 | TF_OD_API_LABELS_FILE, 126 | TF_OD_API_IS_QUANTIZED); 127 | } catch (final IOException e) { 128 | e.printStackTrace(); 129 | LOGGER.e(e, "Exception initializing classifier!"); 130 | Toast toast = 131 | Toast.makeText( 132 | getApplicationContext(), "Classifier could not be initialized", Toast.LENGTH_SHORT); 133 | toast.show(); 134 | finish(); 135 | } 136 | } 137 | 138 | private void handleResult(Bitmap bitmap, List<Classifier.Recognition> results) { 139 | final Canvas canvas = new Canvas(bitmap); 140 | final Paint paint = new Paint(); 141 | paint.setColor(Color.RED); 142 | paint.setStyle(Paint.Style.STROKE); 143 | 
paint.setStrokeWidth(2.0f); 144 | 145 | final List<Classifier.Recognition> mappedRecognitions = 146 | new LinkedList<Classifier.Recognition>(); 147 | 148 | for (final Classifier.Recognition result : results) { 149 | final RectF location = result.getLocation(); 150 | if (location != null && result.getConfidence() >= MINIMUM_CONFIDENCE_TF_OD_API) { 151 | canvas.drawRect(location, paint); 152 | // cropToFrameTransform.mapRect(location); 153 | // 154 | // result.setLocation(location); 155 | // mappedRecognitions.add(result); 156 | } 157 | } 158 | // tracker.trackResults(mappedRecognitions, new Random().nextInt()); 159 | // trackingOverlay.postInvalidate(); 160 | imageView.setImageBitmap(bitmap); 161 | } 162 | } 163 | -------------------------------------------------------------------------------- /android/app/src/main/java/org/tensorflow/lite/examples/detection/customview/AutoFitTextureView.java: -------------------------------------------------------------------------------- 1 | /* 2 | * Copyright 2019 The TensorFlow Authors. All Rights Reserved. 3 | * 4 | * Licensed under the Apache License, Version 2.0 (the "License"); 5 | * you may not use this file except in compliance with the License. 6 | * You may obtain a copy of the License at 7 | * 8 | * http://www.apache.org/licenses/LICENSE-2.0 9 | * 10 | * Unless required by applicable law or agreed to in writing, software 11 | * distributed under the License is distributed on an "AS IS" BASIS, 12 | * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 13 | * See the License for the specific language governing permissions and 14 | * limitations under the License. 15 | */ 16 | 17 | package org.tensorflow.lite.examples.detection.customview; 18 | 19 | import android.content.Context; 20 | import android.util.AttributeSet; 21 | import android.view.TextureView; 22 | 23 | /** A {@link TextureView} that can be adjusted to a specified aspect ratio. 
*/ 24 | public class AutoFitTextureView extends TextureView { 25 | private int ratioWidth = 0; 26 | private int ratioHeight = 0; 27 | 28 | public AutoFitTextureView(final Context context) { 29 | this(context, null); 30 | } 31 | 32 | public AutoFitTextureView(final Context context, final AttributeSet attrs) { 33 | this(context, attrs, 0); 34 | } 35 | 36 | public AutoFitTextureView(final Context context, final AttributeSet attrs, final int defStyle) { 37 | super(context, attrs, defStyle); 38 | } 39 | 40 | /** 41 | * Sets the aspect ratio for this view. The size of the view will be measured based on the ratio 42 | * calculated from the parameters. Note that the actual sizes of parameters don't matter, that is, 43 | * calling setAspectRatio(2, 3) and setAspectRatio(4, 6) make the same result. 44 | * 45 | * @param width Relative horizontal size 46 | * @param height Relative vertical size 47 | */ 48 | public void setAspectRatio(final int width, final int height) { 49 | if (width < 0 || height < 0) { 50 | throw new IllegalArgumentException("Size cannot be negative."); 51 | } 52 | ratioWidth = width; 53 | ratioHeight = height; 54 | requestLayout(); 55 | } 56 | 57 | @Override 58 | protected void onMeasure(final int widthMeasureSpec, final int heightMeasureSpec) { 59 | super.onMeasure(widthMeasureSpec, heightMeasureSpec); 60 | final int width = MeasureSpec.getSize(widthMeasureSpec); 61 | final int height = MeasureSpec.getSize(heightMeasureSpec); 62 | if (0 == ratioWidth || 0 == ratioHeight) { 63 | setMeasuredDimension(width, height); 64 | } else { 65 | if (width < height * ratioWidth / ratioHeight) { 66 | setMeasuredDimension(width, width * ratioHeight / ratioWidth); 67 | } else { 68 | setMeasuredDimension(height * ratioWidth / ratioHeight, height); 69 | } 70 | } 71 | } 72 | } 73 | -------------------------------------------------------------------------------- /android/app/src/main/java/org/tensorflow/lite/examples/detection/customview/OverlayView.java: 
-------------------------------------------------------------------------------- 1 | /* Copyright 2019 The TensorFlow Authors. All Rights Reserved. 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 14 | ==============================================================================*/ 15 | 16 | package org.tensorflow.lite.examples.detection.customview; 17 | 18 | import android.content.Context; 19 | import android.graphics.Canvas; 20 | import android.util.AttributeSet; 21 | import android.view.View; 22 | import java.util.LinkedList; 23 | import java.util.List; 24 | 25 | /** A simple View providing a render callback to other classes. */ 26 | public class OverlayView extends View { 27 | private final List<DrawCallback> callbacks = new LinkedList<DrawCallback>(); 28 | 29 | public OverlayView(final Context context, final AttributeSet attrs) { 30 | super(context, attrs); 31 | } 32 | 33 | public void addCallback(final DrawCallback callback) { 34 | callbacks.add(callback); 35 | } 36 | 37 | @Override 38 | public synchronized void draw(final Canvas canvas) { 39 | for (final DrawCallback callback : callbacks) { 40 | callback.drawCallback(canvas); 41 | } 42 | } 43 | 44 | /** Interface defining the callback for client classes. 
*/ 45 | public interface DrawCallback { 46 | public void drawCallback(final Canvas canvas); 47 | } 48 | } 49 | -------------------------------------------------------------------------------- /android/app/src/main/java/org/tensorflow/lite/examples/detection/customview/RecognitionScoreView.java: -------------------------------------------------------------------------------- 1 | /* Copyright 2019 The TensorFlow Authors. All Rights Reserved. 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 
14 | ==============================================================================*/ 15 | 16 | package org.tensorflow.lite.examples.detection.customview; 17 | 18 | import android.content.Context; 19 | import android.graphics.Canvas; 20 | import android.graphics.Paint; 21 | import android.util.AttributeSet; 22 | import android.util.TypedValue; 23 | import android.view.View; 24 | import java.util.List; 25 | import org.tensorflow.lite.examples.detection.tflite.Classifier.Recognition; 26 | 27 | public class RecognitionScoreView extends View implements ResultsView { 28 | private static final float TEXT_SIZE_DIP = 14; 29 | private final float textSizePx; 30 | private final Paint fgPaint; 31 | private final Paint bgPaint; 32 | private List<Recognition> results; 33 | 34 | public RecognitionScoreView(final Context context, final AttributeSet set) { 35 | super(context, set); 36 | 37 | textSizePx = 38 | TypedValue.applyDimension( 39 | TypedValue.COMPLEX_UNIT_DIP, TEXT_SIZE_DIP, getResources().getDisplayMetrics()); 40 | fgPaint = new Paint(); 41 | fgPaint.setTextSize(textSizePx); 42 | 43 | bgPaint = new Paint(); 44 | bgPaint.setColor(0xcc4285f4); 45 | } 46 | 47 | @Override 48 | public void setResults(final List<Recognition> results) { 49 | this.results = results; 50 | postInvalidate(); 51 | } 52 | 53 | @Override 54 | public void onDraw(final Canvas canvas) { 55 | final int x = 10; 56 | int y = (int) (fgPaint.getTextSize() * 1.5f); 57 | 58 | canvas.drawPaint(bgPaint); 59 | 60 | if (results != null) { 61 | for (final Recognition recog : results) { 62 | canvas.drawText(recog.getTitle() + ": " + recog.getConfidence(), x, y, fgPaint); 63 | y += (int) (fgPaint.getTextSize() * 1.5f); 64 | } 65 | } 66 | } 67 | } 68 | -------------------------------------------------------------------------------- /android/app/src/main/java/org/tensorflow/lite/examples/detection/customview/ResultsView.java: -------------------------------------------------------------------------------- 1 | /* Copyright 2019 The TensorFlow 
Authors. All Rights Reserved. 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 14 | ==============================================================================*/ 15 | 16 | package org.tensorflow.lite.examples.detection.customview; 17 | 18 | import java.util.List; 19 | import org.tensorflow.lite.examples.detection.tflite.Classifier.Recognition; 20 | 21 | public interface ResultsView { 22 | public void setResults(final List<Recognition> results); 23 | } 24 | -------------------------------------------------------------------------------- /android/app/src/main/java/org/tensorflow/lite/examples/detection/env/BorderedText.java: -------------------------------------------------------------------------------- 1 | /* Copyright 2019 The TensorFlow Authors. All Rights Reserved. 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 
14 | ==============================================================================*/ 15 | 16 | package org.tensorflow.lite.examples.detection.env; 17 | 18 | import android.graphics.Canvas; 19 | import android.graphics.Color; 20 | import android.graphics.Paint; 21 | import android.graphics.Paint.Align; 22 | import android.graphics.Paint.Style; 23 | import android.graphics.Rect; 24 | import android.graphics.Typeface; 25 | import java.util.Vector; 26 | 27 | /** A class that encapsulates the tedious bits of rendering legible, bordered text onto a canvas. */ 28 | public class BorderedText { 29 | private final Paint interiorPaint; 30 | private final Paint exteriorPaint; 31 | 32 | private final float textSize; 33 | 34 | /** 35 | * Creates a left-aligned bordered text object with a white interior, and a black exterior with 36 | * the specified text size. 37 | * 38 | * @param textSize text size in pixels 39 | */ 40 | public BorderedText(final float textSize) { 41 | this(Color.WHITE, Color.BLACK, textSize); 42 | } 43 | 44 | /** 45 | * Create a bordered text object with the specified interior and exterior colors, text size and 46 | * alignment. 
47 | * 48 | * @param interiorColor the interior text color 49 | * @param exteriorColor the exterior text color 50 | * @param textSize text size in pixels 51 | */ 52 | public BorderedText(final int interiorColor, final int exteriorColor, final float textSize) { 53 | interiorPaint = new Paint(); 54 | interiorPaint.setTextSize(textSize); 55 | interiorPaint.setColor(interiorColor); 56 | interiorPaint.setStyle(Style.FILL); 57 | interiorPaint.setAntiAlias(false); 58 | interiorPaint.setAlpha(255); 59 | 60 | exteriorPaint = new Paint(); 61 | exteriorPaint.setTextSize(textSize); 62 | exteriorPaint.setColor(exteriorColor); 63 | exteriorPaint.setStyle(Style.FILL_AND_STROKE); 64 | exteriorPaint.setStrokeWidth(textSize / 8); 65 | exteriorPaint.setAntiAlias(false); 66 | exteriorPaint.setAlpha(255); 67 | 68 | this.textSize = textSize; 69 | } 70 | 71 | public void setTypeface(Typeface typeface) { 72 | interiorPaint.setTypeface(typeface); 73 | exteriorPaint.setTypeface(typeface); 74 | } 75 | 76 | public void drawText(final Canvas canvas, final float posX, final float posY, final String text) { 77 | canvas.drawText(text, posX, posY, exteriorPaint); 78 | canvas.drawText(text, posX, posY, interiorPaint); 79 | } 80 | 81 | public void drawText( 82 | final Canvas canvas, final float posX, final float posY, final String text, Paint bgPaint) { 83 | 84 | float width = exteriorPaint.measureText(text); 85 | float textSize = exteriorPaint.getTextSize(); 86 | Paint paint = new Paint(bgPaint); 87 | paint.setStyle(Paint.Style.FILL); 88 | paint.setAlpha(160); 89 | canvas.drawRect(posX, (posY + (int) (textSize)), (posX + (int) (width)), posY, paint); 90 | 91 | canvas.drawText(text, posX, (posY + textSize), interiorPaint); 92 | } 93 | 94 | public void drawLines(Canvas canvas, final float posX, final float posY, Vector<String> lines) { 95 | int lineNum = 0; 96 | for (final String line : lines) { 97 | drawText(canvas, posX, posY - getTextSize() * (lines.size() - lineNum - 1), line); 98 | ++lineNum; 99 | } 100 
| } 101 | 102 | public void setInteriorColor(final int color) { 103 | interiorPaint.setColor(color); 104 | } 105 | 106 | public void setExteriorColor(final int color) { 107 | exteriorPaint.setColor(color); 108 | } 109 | 110 | public float getTextSize() { 111 | return textSize; 112 | } 113 | 114 | public void setAlpha(final int alpha) { 115 | interiorPaint.setAlpha(alpha); 116 | exteriorPaint.setAlpha(alpha); 117 | } 118 | 119 | public void getTextBounds( 120 | final String line, final int index, final int count, final Rect lineBounds) { 121 | interiorPaint.getTextBounds(line, index, count, lineBounds); 122 | } 123 | 124 | public void setTextAlign(final Align align) { 125 | interiorPaint.setTextAlign(align); 126 | exteriorPaint.setTextAlign(align); 127 | } 128 | } 129 | -------------------------------------------------------------------------------- /android/app/src/main/java/org/tensorflow/lite/examples/detection/env/ImageUtils.java: -------------------------------------------------------------------------------- 1 | /* Copyright 2019 The TensorFlow Authors. All Rights Reserved. 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 
14 | ==============================================================================*/ 15 | 16 | package org.tensorflow.lite.examples.detection.env; 17 | 18 | import android.graphics.Bitmap; 19 | import android.graphics.Matrix; 20 | import android.os.Environment; 21 | import java.io.File; 22 | import java.io.FileOutputStream; 23 | 24 | /** Utility class for manipulating images. */ 25 | public class ImageUtils { 26 | // This value is 2 ^ 18 - 1, and is used to clamp the RGB values before their ranges 27 | // are normalized to eight bits. 28 | static final int kMaxChannelValue = 262143; 29 | 30 | @SuppressWarnings("unused") 31 | private static final Logger LOGGER = new Logger(); 32 | 33 | /** 34 | * Utility method to compute the allocated size in bytes of a YUV420SP image of the given 35 | * dimensions. 36 | */ 37 | public static int getYUVByteSize(final int width, final int height) { 38 | // The luminance plane requires 1 byte per pixel. 39 | final int ySize = width * height; 40 | 41 | // The UV plane works on 2x2 blocks, so dimensions with odd size must be rounded up. 42 | // Each 2x2 block takes 2 bytes to encode, one each for U and V. 43 | final int uvSize = ((width + 1) / 2) * ((height + 1) / 2) * 2; 44 | 45 | return ySize + uvSize; 46 | } 47 | 48 | /** 49 | * Saves a Bitmap object to disk for analysis. 50 | * 51 | * @param bitmap The bitmap to save. 52 | */ 53 | public static void saveBitmap(final Bitmap bitmap) { 54 | saveBitmap(bitmap, "preview.png"); 55 | } 56 | 57 | /** 58 | * Saves a Bitmap object to disk for analysis. 59 | * 60 | * @param bitmap The bitmap to save. 61 | * @param filename The location to save the bitmap to. 
62 | */ 63 | public static void saveBitmap(final Bitmap bitmap, final String filename) { 64 | final String root = 65 | Environment.getExternalStorageDirectory().getAbsolutePath() + File.separator + "tensorflow"; 66 | LOGGER.i("Saving %dx%d bitmap to %s.", bitmap.getWidth(), bitmap.getHeight(), root); 67 | final File myDir = new File(root); 68 | 69 | if (!myDir.mkdirs()) { 70 | LOGGER.i("Make dir failed"); 71 | } 72 | 73 | final String fname = filename; 74 | final File file = new File(myDir, fname); 75 | if (file.exists()) { 76 | file.delete(); 77 | } 78 | try { 79 | final FileOutputStream out = new FileOutputStream(file); 80 | bitmap.compress(Bitmap.CompressFormat.PNG, 99, out); 81 | out.flush(); 82 | out.close(); 83 | } catch (final Exception e) { 84 | LOGGER.e(e, "Exception!"); 85 | } 86 | } 87 | 88 | public static void convertYUV420SPToARGB8888(byte[] input, int width, int height, int[] output) { 89 | final int frameSize = width * height; 90 | for (int j = 0, yp = 0; j < height; j++) { 91 | int uvp = frameSize + (j >> 1) * width; 92 | int u = 0; 93 | int v = 0; 94 | 95 | for (int i = 0; i < width; i++, yp++) { 96 | int y = 0xff & input[yp]; 97 | if ((i & 1) == 0) { 98 | v = 0xff & input[uvp++]; 99 | u = 0xff & input[uvp++]; 100 | } 101 | 102 | output[yp] = YUV2RGB(y, u, v); 103 | } 104 | } 105 | } 106 | 107 | private static int YUV2RGB(int y, int u, int v) { 108 | // Adjust and check YUV values 109 | y = (y - 16) < 0 ? 0 : (y - 16); 110 | u -= 128; 111 | v -= 128; 112 | 113 | // This is the floating point equivalent. We do the conversion in integer 114 | // because some Android devices do not have floating point in hardware. 
115 | // nR = (int)(1.164 * nY + 2.018 * nU); 116 | // nG = (int)(1.164 * nY - 0.813 * nV - 0.391 * nU); 117 | // nB = (int)(1.164 * nY + 1.596 * nV); 118 | int y1192 = 1192 * y; 119 | int r = (y1192 + 1634 * v); 120 | int g = (y1192 - 833 * v - 400 * u); 121 | int b = (y1192 + 2066 * u); 122 | 123 | // Clipping RGB values to be inside boundaries [ 0 , kMaxChannelValue ] 124 | r = r > kMaxChannelValue ? kMaxChannelValue : (r < 0 ? 0 : r); 125 | g = g > kMaxChannelValue ? kMaxChannelValue : (g < 0 ? 0 : g); 126 | b = b > kMaxChannelValue ? kMaxChannelValue : (b < 0 ? 0 : b); 127 | 128 | return 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff); 129 | } 130 | 131 | public static void convertYUV420ToARGB8888( 132 | byte[] yData, 133 | byte[] uData, 134 | byte[] vData, 135 | int width, 136 | int height, 137 | int yRowStride, 138 | int uvRowStride, 139 | int uvPixelStride, 140 | int[] out) { 141 | int yp = 0; 142 | for (int j = 0; j < height; j++) { 143 | int pY = yRowStride * j; 144 | int pUV = uvRowStride * (j >> 1); 145 | 146 | for (int i = 0; i < width; i++) { 147 | int uv_offset = pUV + (i >> 1) * uvPixelStride; 148 | 149 | out[yp++] = YUV2RGB(0xff & yData[pY + i], 0xff & uData[uv_offset], 0xff & vData[uv_offset]); 150 | } 151 | } 152 | } 153 | 154 | /** 155 | * Returns a transformation matrix from one reference frame into another. Handles cropping (if 156 | * maintaining aspect ratio is desired) and rotation. 157 | * 158 | * @param srcWidth Width of source frame. 159 | * @param srcHeight Height of source frame. 160 | * @param dstWidth Width of destination frame. 161 | * @param dstHeight Height of destination frame. 162 | * @param applyRotation Amount of rotation to apply from one frame to another. Must be a multiple 163 | * of 90. 164 | * @param maintainAspectRatio If true, will ensure that scaling in x and y remains constant, 165 | * cropping the image if necessary. 166 | * @return The transformation fulfilling the desired requirements. 
167 | */ 168 | public static Matrix getTransformationMatrix( 169 | final int srcWidth, 170 | final int srcHeight, 171 | final int dstWidth, 172 | final int dstHeight, 173 | final int applyRotation, 174 | final boolean maintainAspectRatio) { 175 | final Matrix matrix = new Matrix(); 176 | 177 | if (applyRotation != 0) { 178 | if (applyRotation % 90 != 0) { 179 | LOGGER.w("Rotation of %d %% 90 != 0", applyRotation); 180 | } 181 | 182 | // Translate so center of image is at origin. 183 | matrix.postTranslate(-srcWidth / 2.0f, -srcHeight / 2.0f); 184 | 185 | // Rotate around origin. 186 | matrix.postRotate(applyRotation); 187 | } 188 | 189 | // Account for the already applied rotation, if any, and then determine how 190 | // much scaling is needed for each axis. 191 | final boolean transpose = (Math.abs(applyRotation) + 90) % 180 == 0; 192 | 193 | final int inWidth = transpose ? srcHeight : srcWidth; 194 | final int inHeight = transpose ? srcWidth : srcHeight; 195 | 196 | // Apply scaling if necessary. 197 | if (inWidth != dstWidth || inHeight != dstHeight) { 198 | final float scaleFactorX = dstWidth / (float) inWidth; 199 | final float scaleFactorY = dstHeight / (float) inHeight; 200 | 201 | if (maintainAspectRatio) { 202 | // Scale by the larger factor so that dst is filled completely while 203 | // maintaining the aspect ratio. Some of the image may fall off the edge. 204 | final float scaleFactor = Math.max(scaleFactorX, scaleFactorY); 205 | matrix.postScale(scaleFactor, scaleFactor); 206 | } else { 207 | // Scale exactly to fill dst from src. 208 | matrix.postScale(scaleFactorX, scaleFactorY); 209 | } 210 | } 211 | 212 | if (applyRotation != 0) { 213 | // Translate back from origin centered reference to destination frame.
214 | matrix.postTranslate(dstWidth / 2.0f, dstHeight / 2.0f); 215 | } 216 | 217 | return matrix; 218 | } 219 | } 220 | -------------------------------------------------------------------------------- /android/app/src/main/java/org/tensorflow/lite/examples/detection/env/Logger.java: -------------------------------------------------------------------------------- 1 | /* Copyright 2019 The TensorFlow Authors. All Rights Reserved. 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 14 | ==============================================================================*/ 15 | 16 | package org.tensorflow.lite.examples.detection.env; 17 | 18 | import android.util.Log; 19 | import java.util.HashSet; 20 | import java.util.Set; 21 | 22 | /** Wrapper for the platform log function, allows convenient message prefixing and log disabling. 
*/ 23 | public final class Logger { 24 | private static final String DEFAULT_TAG = "tensorflow"; 25 | private static final int DEFAULT_MIN_LOG_LEVEL = Log.DEBUG; 26 | 27 | // Classes to be ignored when examining the stack trace 28 | private static final Set<String> IGNORED_CLASS_NAMES; 29 | 30 | static { 31 | IGNORED_CLASS_NAMES = new HashSet<String>(3); 32 | IGNORED_CLASS_NAMES.add("dalvik.system.VMStack"); 33 | IGNORED_CLASS_NAMES.add("java.lang.Thread"); 34 | IGNORED_CLASS_NAMES.add(Logger.class.getCanonicalName()); 35 | } 36 | 37 | private final String tag; 38 | private final String messagePrefix; 39 | private int minLogLevel = DEFAULT_MIN_LOG_LEVEL; 40 | 41 | /** 42 | * Creates a Logger using the class name as the message prefix. 43 | * 44 | * @param clazz the simple name of this class is used as the message prefix. 45 | */ 46 | public Logger(final Class<?> clazz) { 47 | this(clazz.getSimpleName()); 48 | } 49 | 50 | /** 51 | * Creates a Logger using the specified message prefix. 52 | * 53 | * @param messagePrefix is prepended to the text of every message. 54 | */ 55 | public Logger(final String messagePrefix) { 56 | this(DEFAULT_TAG, messagePrefix); 57 | } 58 | 59 | /** 60 | * Creates a Logger with a custom tag and a custom message prefix. If the message prefix is set to 61 | * 62 | *
<pre>null</pre>
63 | * 64 | * , the caller's class name is used as the prefix. 65 | * 66 | * @param tag identifies the source of a log message. 67 | * @param messagePrefix prepended to every message if non-null. If null, the name of the caller is 68 | * being used 69 | */ 70 | public Logger(final String tag, final String messagePrefix) { 71 | this.tag = tag; 72 | final String prefix = messagePrefix == null ? getCallerSimpleName() : messagePrefix; 73 | this.messagePrefix = (prefix.length() > 0) ? prefix + ": " : prefix; 74 | } 75 | 76 | /** Creates a Logger using the caller's class name as the message prefix. */ 77 | public Logger() { 78 | this(DEFAULT_TAG, null); 79 | } 80 | 81 | /** Creates a Logger using the caller's class name as the message prefix. */ 82 | public Logger(final int minLogLevel) { 83 | this(DEFAULT_TAG, null); 84 | this.minLogLevel = minLogLevel; 85 | } 86 | 87 | /** 88 | * Return caller's simple name. 89 | * 90 | *

Android getStackTrace() returns an array that looks like this: stackTrace[0]: 91 | * dalvik.system.VMStack stackTrace[1]: java.lang.Thread stackTrace[2]: 92 | * com.google.android.apps.unveil.env.UnveilLogger stackTrace[3]: 93 | * com.google.android.apps.unveil.BaseApplication 94 | * 95 | *

This function returns the simple version of the first non-filtered name. 96 | * 97 | * @return caller's simple name 98 | */ 99 | private static String getCallerSimpleName() { 100 | // Get the current callstack so we can pull the class of the caller off of it. 101 | final StackTraceElement[] stackTrace = Thread.currentThread().getStackTrace(); 102 | 103 | for (final StackTraceElement elem : stackTrace) { 104 | final String className = elem.getClassName(); 105 | if (!IGNORED_CLASS_NAMES.contains(className)) { 106 | // We're only interested in the simple name of the class, not the complete package. 107 | final String[] classParts = className.split("\\."); 108 | return classParts[classParts.length - 1]; 109 | } 110 | } 111 | 112 | return Logger.class.getSimpleName(); 113 | } 114 | 115 | public void setMinLogLevel(final int minLogLevel) { 116 | this.minLogLevel = minLogLevel; 117 | } 118 | 119 | public boolean isLoggable(final int logLevel) { 120 | return logLevel >= minLogLevel || Log.isLoggable(tag, logLevel); 121 | } 122 | 123 | private String toMessage(final String format, final Object... args) { 124 | return messagePrefix + (args.length > 0 ? String.format(format, args) : format); 125 | } 126 | 127 | public void v(final String format, final Object... args) { 128 | if (isLoggable(Log.VERBOSE)) { 129 | Log.v(tag, toMessage(format, args)); 130 | } 131 | } 132 | 133 | public void v(final Throwable t, final String format, final Object... args) { 134 | if (isLoggable(Log.VERBOSE)) { 135 | Log.v(tag, toMessage(format, args), t); 136 | } 137 | } 138 | 139 | public void d(final String format, final Object... args) { 140 | if (isLoggable(Log.DEBUG)) { 141 | Log.d(tag, toMessage(format, args)); 142 | } 143 | } 144 | 145 | public void d(final Throwable t, final String format, final Object... args) { 146 | if (isLoggable(Log.DEBUG)) { 147 | Log.d(tag, toMessage(format, args), t); 148 | } 149 | } 150 | 151 | public void i(final String format, final Object... 
args) { 152 | if (isLoggable(Log.INFO)) { 153 | Log.i(tag, toMessage(format, args)); 154 | } 155 | } 156 | 157 | public void i(final Throwable t, final String format, final Object... args) { 158 | if (isLoggable(Log.INFO)) { 159 | Log.i(tag, toMessage(format, args), t); 160 | } 161 | } 162 | 163 | public void w(final String format, final Object... args) { 164 | if (isLoggable(Log.WARN)) { 165 | Log.w(tag, toMessage(format, args)); 166 | } 167 | } 168 | 169 | public void w(final Throwable t, final String format, final Object... args) { 170 | if (isLoggable(Log.WARN)) { 171 | Log.w(tag, toMessage(format, args), t); 172 | } 173 | } 174 | 175 | public void e(final String format, final Object... args) { 176 | if (isLoggable(Log.ERROR)) { 177 | Log.e(tag, toMessage(format, args)); 178 | } 179 | } 180 | 181 | public void e(final Throwable t, final String format, final Object... args) { 182 | if (isLoggable(Log.ERROR)) { 183 | Log.e(tag, toMessage(format, args), t); 184 | } 185 | } 186 | } 187 | -------------------------------------------------------------------------------- /android/app/src/main/java/org/tensorflow/lite/examples/detection/env/Size.java: -------------------------------------------------------------------------------- 1 | /* Copyright 2019 The TensorFlow Authors. All Rights Reserved. 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 
14 | ==============================================================================*/ 15 | 16 | package org.tensorflow.lite.examples.detection.env; 17 | 18 | import android.graphics.Bitmap; 19 | import android.text.TextUtils; 20 | import java.io.Serializable; 21 | import java.util.ArrayList; 22 | import java.util.List; 23 | 24 | /** Size class independent of a Camera object. */ 25 | public class Size implements Comparable<Size>, Serializable { 26 | 27 | // 1.4 went out with this UID so we'll need to maintain it to preserve pending queries when 28 | // upgrading. 29 | public static final long serialVersionUID = 7689808733290872361L; 30 | 31 | public final int width; 32 | public final int height; 33 | 34 | public Size(final int width, final int height) { 35 | this.width = width; 36 | this.height = height; 37 | } 38 | 39 | public Size(final Bitmap bmp) { 40 | this.width = bmp.getWidth(); 41 | this.height = bmp.getHeight(); 42 | } 43 | 44 | /** 45 | * Rotate a size by the given number of degrees. 46 | * 47 | * @param size Size to rotate. 48 | * @param rotation Degrees {0, 90, 180, 270} to rotate the size. 49 | * @return Rotated size. 50 | */ 51 | public static Size getRotatedSize(final Size size, final int rotation) { 52 | if (rotation % 180 != 0) { 53 | // The phone is portrait, therefore the camera is sideways and frame should be rotated. 54 | return new Size(size.height, size.width); 55 | } 56 | return size; 57 | } 58 | 59 | public static Size parseFromString(String sizeString) { 60 | if (TextUtils.isEmpty(sizeString)) { 61 | return null; 62 | } 63 | 64 | sizeString = sizeString.trim(); 65 | 66 | // The expected format is "<width>x<height>".
67 | final String[] components = sizeString.split("x"); 68 | if (components.length == 2) { 69 | try { 70 | final int width = Integer.parseInt(components[0]); 71 | final int height = Integer.parseInt(components[1]); 72 | return new Size(width, height); 73 | } catch (final NumberFormatException e) { 74 | return null; 75 | } 76 | } else { 77 | return null; 78 | } 79 | } 80 | 81 | public static List<Size> sizeStringToList(final String sizes) { 82 | final List<Size> sizeList = new ArrayList<Size>(); 83 | if (sizes != null) { 84 | final String[] pairs = sizes.split(","); 85 | for (final String pair : pairs) { 86 | final Size size = Size.parseFromString(pair); 87 | if (size != null) { 88 | sizeList.add(size); 89 | } 90 | } 91 | } 92 | return sizeList; 93 | } 94 | 95 | public static String sizeListToString(final List<Size> sizes) { 96 | String sizesString = ""; 97 | if (sizes != null && sizes.size() > 0) { 98 | sizesString = sizes.get(0).toString(); 99 | for (int i = 1; i < sizes.size(); i++) { 100 | sizesString += "," + sizes.get(i).toString(); 101 | } 102 | } 103 | return sizesString; 104 | } 105 | 106 | public static final String dimensionsAsString(final int width, final int height) { 107 | return width + "x" + height; 108 | } 109 | 110 | public final float aspectRatio() { 111 | return (float) width / (float) height; 112 | } 113 | 114 | @Override 115 | public int compareTo(final Size other) { 116 | return width * height - other.width * other.height; 117 | } 118 | 119 | @Override 120 | public boolean equals(final Object other) { 121 | if (other == null) { 122 | return false; 123 | } 124 | 125 | if (!(other instanceof Size)) { 126 | return false; 127 | } 128 | 129 | final Size otherSize = (Size) other; 130 | return (width == otherSize.width && height == otherSize.height); 131 | } 132 | 133 | @Override 134 | public int hashCode() { 135 | return width * 32713 + height; 136 | } 137 | 138 | @Override 139 | public String toString() { 140 | return dimensionsAsString(width, height); 141 | } 142 | } 143
| -------------------------------------------------------------------------------- /android/app/src/main/java/org/tensorflow/lite/examples/detection/env/Utils.java: -------------------------------------------------------------------------------- 1 | package org.tensorflow.lite.examples.detection.env; 2 | 3 | import android.content.Context; 4 | import android.content.res.AssetFileDescriptor; 5 | import android.content.res.AssetManager; 6 | import android.graphics.Bitmap; 7 | import android.graphics.BitmapFactory; 8 | import android.graphics.Canvas; 9 | import android.graphics.Matrix; 10 | import android.os.Environment; 11 | import android.util.Log; 12 | 13 | import org.tensorflow.lite.examples.detection.MainActivity; 14 | 15 | import java.io.File; 16 | import java.io.FileInputStream; 17 | import java.io.FileOutputStream; 18 | import java.io.IOException; 19 | import java.io.InputStream; 20 | import java.io.OutputStreamWriter; 21 | import java.nio.MappedByteBuffer; 22 | import java.nio.channels.FileChannel; 23 | 24 | public class Utils { 25 | 26 | /** 27 | * Memory-map the model file in Assets. 
28 | */ 29 | public static MappedByteBuffer loadModelFile(AssetManager assets, String modelFilename) 30 | throws IOException { 31 | AssetFileDescriptor fileDescriptor = assets.openFd(modelFilename); 32 | FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor()); 33 | FileChannel fileChannel = inputStream.getChannel(); 34 | long startOffset = fileDescriptor.getStartOffset(); 35 | long declaredLength = fileDescriptor.getDeclaredLength(); 36 | return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength); 37 | } 38 | 39 | public static void softmax(final float[] vals) { 40 | float max = Float.NEGATIVE_INFINITY; 41 | for (final float val : vals) { 42 | max = Math.max(max, val); 43 | } 44 | float sum = 0.0f; 45 | for (int i = 0; i < vals.length; ++i) { 46 | vals[i] = (float) Math.exp(vals[i] - max); 47 | sum += vals[i]; 48 | } 49 | for (int i = 0; i < vals.length; ++i) { 50 | vals[i] = vals[i] / sum; 51 | } 52 | } 53 | 54 | public static float expit(final float x) { 55 | return (float) (1. / (1. 
+ Math.exp(-x))); 56 | } 57 | 58 | // public static Bitmap scale(Context context, String filePath) { 59 | // AssetManager assetManager = context.getAssets(); 60 | // 61 | // InputStream istr; 62 | // Bitmap bitmap = null; 63 | // try { 64 | // istr = assetManager.open(filePath); 65 | // bitmap = BitmapFactory.decodeStream(istr); 66 | // bitmap = Bitmap.createScaledBitmap(bitmap, MainActivity.TF_OD_API_INPUT_SIZE, MainActivity.TF_OD_API_INPUT_SIZE, false); 67 | // } catch (IOException e) { 68 | // // handle exception 69 | // Log.e("getBitmapFromAsset", "getBitmapFromAsset: " + e.getMessage()); 70 | // } 71 | // 72 | // return bitmap; 73 | // } 74 | 75 | public static Bitmap getBitmapFromAsset(Context context, String filePath) { 76 | AssetManager assetManager = context.getAssets(); 77 | 78 | InputStream istr; 79 | Bitmap bitmap = null; 80 | try { 81 | istr = assetManager.open(filePath); 82 | bitmap = BitmapFactory.decodeStream(istr); 83 | // return bitmap.copy(Bitmap.Config.ARGB_8888,true); 84 | } catch (IOException e) { 85 | // handle exception 86 | Log.e("getBitmapFromAsset", "getBitmapFromAsset: " + e.getMessage()); 87 | } 88 | 89 | return bitmap; 90 | } 91 | 92 | /** 93 | * Returns a transformation matrix from one reference frame into another. 94 | * Handles cropping (if maintaining aspect ratio is desired) and rotation. 95 | * 96 | * @param srcWidth Width of source frame. 97 | * @param srcHeight Height of source frame. 98 | * @param dstWidth Width of destination frame. 99 | * @param dstHeight Height of destination frame. 100 | * @param applyRotation Amount of rotation to apply from one frame to another. 101 | * Must be a multiple of 90. 102 | * @param maintainAspectRatio If true, will ensure that scaling in x and y remains constant, 103 | * cropping the image if necessary. 104 | * @return The transformation fulfilling the desired requirements. 
105 | */ 106 | public static Matrix getTransformationMatrix( 107 | final int srcWidth, 108 | final int srcHeight, 109 | final int dstWidth, 110 | final int dstHeight, 111 | final int applyRotation, 112 | final boolean maintainAspectRatio) { 113 | final Matrix matrix = new Matrix(); 114 | 115 | if (applyRotation != 0) { 116 | // Translate so center of image is at origin. 117 | matrix.postTranslate(-srcWidth / 2.0f, -srcHeight / 2.0f); 118 | 119 | // Rotate around origin. 120 | matrix.postRotate(applyRotation); 121 | } 122 | 123 | // Account for the already applied rotation, if any, and then determine how 124 | // much scaling is needed for each axis. 125 | final boolean transpose = (Math.abs(applyRotation) + 90) % 180 == 0; 126 | 127 | final int inWidth = transpose ? srcHeight : srcWidth; 128 | final int inHeight = transpose ? srcWidth : srcHeight; 129 | 130 | // Apply scaling if necessary. 131 | if (inWidth != dstWidth || inHeight != dstHeight) { 132 | final float scaleFactorX = dstWidth / (float) inWidth; 133 | final float scaleFactorY = dstHeight / (float) inHeight; 134 | 135 | if (maintainAspectRatio) { 136 | // Scale by the larger factor so that dst is filled completely while 137 | // maintaining the aspect ratio. Some of the image may fall off the edge. 138 | final float scaleFactor = Math.max(scaleFactorX, scaleFactorY); 139 | matrix.postScale(scaleFactor, scaleFactor); 140 | } else { 141 | // Scale exactly to fill dst from src. 142 | matrix.postScale(scaleFactorX, scaleFactorY); 143 | } 144 | } 145 | 146 | if (applyRotation != 0) { 147 | // Translate back from origin centered reference to destination frame.
148 | matrix.postTranslate(dstWidth / 2.0f, dstHeight / 2.0f); 149 | } 150 | 151 | return matrix; 152 | } 153 | 154 | public static Bitmap processBitmap(Bitmap source, int size){ 155 | 156 | int image_height = source.getHeight(); 157 | int image_width = source.getWidth(); 158 | 159 | Bitmap croppedBitmap = Bitmap.createBitmap(size, size, Bitmap.Config.ARGB_8888); 160 | 161 | Matrix frameToCropTransformations = getTransformationMatrix(image_width,image_height,size,size,0,false); 162 | Matrix cropToFrameTransformations = new Matrix(); 163 | frameToCropTransformations.invert(cropToFrameTransformations); 164 | 165 | final Canvas canvas = new Canvas(croppedBitmap); 166 | canvas.drawBitmap(source, frameToCropTransformations, null); 167 | 168 | return croppedBitmap; 169 | } 170 | 171 | public static void writeToFile(String data, Context context) { 172 | try { 173 | String baseDir = Environment.getExternalStorageDirectory().getAbsolutePath(); 174 | String fileName = "myFile.txt"; 175 | 176 | File file = new File(baseDir + File.separator + fileName); 177 | 178 | FileOutputStream stream = new FileOutputStream(file); 179 | try { 180 | stream.write(data.getBytes()); 181 | } finally { 182 | stream.close(); 183 | } 184 | } catch (IOException e) { 185 | Log.e("Exception", "File write failed: " + e.toString()); 186 | } 187 | } 188 | } 189 | -------------------------------------------------------------------------------- /android/app/src/main/java/org/tensorflow/lite/examples/detection/tflite/Classifier.java: -------------------------------------------------------------------------------- 1 | /* Copyright 2019 The TensorFlow Authors. All Rights Reserved. 2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 
5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 14 | ==============================================================================*/ 15 | 16 | package org.tensorflow.lite.examples.detection.tflite; 17 | 18 | import android.graphics.Bitmap; 19 | import android.graphics.RectF; 20 | 21 | import java.util.List; 22 | 23 | /** 24 | * Generic interface for interacting with different recognition engines. 25 | */ 26 | public interface Classifier { 27 | List<Recognition> recognizeImage(Bitmap bitmap); 28 | 29 | void enableStatLogging(final boolean debug); 30 | 31 | String getStatString(); 32 | 33 | void close(); 34 | 35 | void setNumThreads(int num_threads); 36 | 37 | void setUseNNAPI(boolean isChecked); 38 | 39 | float getObjThresh(); 40 | 41 | /** 42 | * An immutable result returned by a Classifier describing what was recognized. 43 | */ 44 | public class Recognition { 45 | /** 46 | * A unique identifier for what has been recognized. Specific to the class, not the instance of 47 | * the object. 48 | */ 49 | private final String id; 50 | 51 | /** 52 | * Display name for the recognition. 53 | */ 54 | private final String title; 55 | 56 | /** 57 | * A sortable score for how good the recognition is relative to others. Higher should be better. 58 | */ 59 | private final Float confidence; 60 | 61 | /** 62 | * Optional location within the source image for the location of the recognized object.
63 | */ 64 | private RectF location; 65 | 66 | private int detectedClass; 67 | 68 | public Recognition( 69 | final String id, final String title, final Float confidence, final RectF location) { 70 | this.id = id; 71 | this.title = title; 72 | this.confidence = confidence; 73 | this.location = location; 74 | } 75 | 76 | public Recognition(final String id, final String title, final Float confidence, final RectF location, int detectedClass) { 77 | this.id = id; 78 | this.title = title; 79 | this.confidence = confidence; 80 | this.location = location; 81 | this.detectedClass = detectedClass; 82 | } 83 | 84 | public String getId() { 85 | return id; 86 | } 87 | 88 | public String getTitle() { 89 | return title; 90 | } 91 | 92 | public Float getConfidence() { 93 | return confidence; 94 | } 95 | 96 | public RectF getLocation() { 97 | return new RectF(location); 98 | } 99 | 100 | public void setLocation(RectF location) { 101 | this.location = location; 102 | } 103 | 104 | public int getDetectedClass() { 105 | return detectedClass; 106 | } 107 | 108 | public void setDetectedClass(int detectedClass) { 109 | this.detectedClass = detectedClass; 110 | } 111 | 112 | @Override 113 | public String toString() { 114 | String resultString = ""; 115 | if (id != null) { 116 | resultString += "[" + id + "] "; 117 | } 118 | 119 | if (title != null) { 120 | resultString += title + " "; 121 | } 122 | 123 | if (confidence != null) { 124 | resultString += String.format("(%.1f%%) ", confidence * 100.0f); 125 | } 126 | 127 | if (location != null) { 128 | resultString += location + " "; 129 | } 130 | 131 | return resultString.trim(); 132 | } 133 | } 134 | } 135 | -------------------------------------------------------------------------------- /android/app/src/main/java/org/tensorflow/lite/examples/detection/tracking/MultiBoxTracker.java: -------------------------------------------------------------------------------- 1 | /* Copyright 2019 The TensorFlow Authors. All Rights Reserved. 
2 | 3 | Licensed under the Apache License, Version 2.0 (the "License"); 4 | you may not use this file except in compliance with the License. 5 | You may obtain a copy of the License at 6 | 7 | http://www.apache.org/licenses/LICENSE-2.0 8 | 9 | Unless required by applicable law or agreed to in writing, software 10 | distributed under the License is distributed on an "AS IS" BASIS, 11 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 12 | See the License for the specific language governing permissions and 13 | limitations under the License. 14 | ==============================================================================*/ 15 | 16 | package org.tensorflow.lite.examples.detection.tracking; 17 | 18 | import android.content.Context; 19 | import android.graphics.Canvas; 20 | import android.graphics.Color; 21 | import android.graphics.Matrix; 22 | import android.graphics.Paint; 23 | import android.graphics.Paint.Cap; 24 | import android.graphics.Paint.Join; 25 | import android.graphics.Paint.Style; 26 | import android.graphics.RectF; 27 | import android.text.TextUtils; 28 | import android.util.Pair; 29 | import android.util.TypedValue; 30 | import java.util.LinkedList; 31 | import java.util.List; 32 | import java.util.Queue; 33 | import org.tensorflow.lite.examples.detection.env.BorderedText; 34 | import org.tensorflow.lite.examples.detection.env.ImageUtils; 35 | import org.tensorflow.lite.examples.detection.env.Logger; 36 | import org.tensorflow.lite.examples.detection.tflite.Classifier.Recognition; 37 | 38 | /** A tracker that handles non-max suppression and matches existing objects to new detections. 
*/ 39 | public class MultiBoxTracker { 40 | private static final float TEXT_SIZE_DIP = 18; 41 | private static final float MIN_SIZE = 16.0f; 42 | private static final int[] COLORS = { 43 | Color.BLUE, 44 | Color.RED, 45 | Color.GREEN, 46 | Color.YELLOW, 47 | Color.CYAN, 48 | Color.MAGENTA, 49 | Color.WHITE, 50 | Color.parseColor("#55FF55"), 51 | Color.parseColor("#FFA500"), 52 | Color.parseColor("#FF8888"), 53 | Color.parseColor("#AAAAFF"), 54 | Color.parseColor("#FFFFAA"), 55 | Color.parseColor("#55AAAA"), 56 | Color.parseColor("#AA33AA"), 57 | Color.parseColor("#0D0068") 58 | }; 59 | final List<Pair<Float, RectF>> screenRects = new LinkedList<Pair<Float, RectF>>(); 60 | private final Logger logger = new Logger(); 61 | private final Queue<Integer> availableColors = new LinkedList<Integer>(); 62 | private final List<TrackedRecognition> trackedObjects = new LinkedList<TrackedRecognition>(); 63 | private final Paint boxPaint = new Paint(); 64 | private final float textSizePx; 65 | private final BorderedText borderedText; 66 | private Matrix frameToCanvasMatrix; 67 | private int frameWidth; 68 | private int frameHeight; 69 | private int sensorOrientation; 70 | 71 | public MultiBoxTracker(final Context context) { 72 | for (final int color : COLORS) { 73 | availableColors.add(color); 74 | } 75 | 76 | boxPaint.setColor(Color.RED); 77 | boxPaint.setStyle(Style.STROKE); 78 | boxPaint.setStrokeWidth(10.0f); 79 | boxPaint.setStrokeCap(Cap.ROUND); 80 | boxPaint.setStrokeJoin(Join.ROUND); 81 | boxPaint.setStrokeMiter(100); 82 | 83 | textSizePx = 84 | TypedValue.applyDimension( 85 | TypedValue.COMPLEX_UNIT_DIP, TEXT_SIZE_DIP, context.getResources().getDisplayMetrics()); 86 | borderedText = new BorderedText(textSizePx); 87 | } 88 | 89 | public synchronized void setFrameConfiguration( 90 | final int width, final int height, final int sensorOrientation) { 91 | frameWidth = width; 92 | frameHeight = height; 93 | this.sensorOrientation = sensorOrientation; 94 | } 95 | 96 | public synchronized void drawDebug(final Canvas canvas) { 97 | final Paint textPaint = new Paint(); 98 |
textPaint.setColor(Color.WHITE); 99 | textPaint.setTextSize(60.0f); 100 | 101 | final Paint boxPaint = new Paint(); 102 | boxPaint.setColor(Color.RED); 103 | boxPaint.setAlpha(200); 104 | boxPaint.setStyle(Style.STROKE); 105 | 106 | for (final Pair<Float, RectF> detection : screenRects) { 107 | final RectF rect = detection.second; 108 | canvas.drawRect(rect, boxPaint); 109 | canvas.drawText("" + detection.first, rect.left, rect.top, textPaint); 110 | borderedText.drawText(canvas, rect.centerX(), rect.centerY(), "" + detection.first); 111 | } 112 | } 113 | 114 | public synchronized void trackResults(final List<Recognition> results, final long timestamp) { 115 | logger.i("Processing %d results from %d", results.size(), timestamp); 116 | processResults(results); 117 | } 118 | 119 | private Matrix getFrameToCanvasMatrix() { 120 | return frameToCanvasMatrix; 121 | } 122 | 123 | public synchronized void draw(final Canvas canvas) { 124 | final boolean rotated = sensorOrientation % 180 == 90; 125 | final float multiplier = 126 | Math.min( 127 | canvas.getHeight() / (float) (rotated ? frameWidth : frameHeight), 128 | canvas.getWidth() / (float) (rotated ? frameHeight : frameWidth)); 129 | frameToCanvasMatrix = 130 | ImageUtils.getTransformationMatrix( 131 | frameWidth, 132 | frameHeight, 133 | (int) (multiplier * (rotated ? frameHeight : frameWidth)), 134 | (int) (multiplier * (rotated ? frameWidth : frameHeight)), 135 | sensorOrientation, 136 | false); 137 | for (final TrackedRecognition recognition : trackedObjects) { 138 | final RectF trackedPos = new RectF(recognition.location); 139 | 140 | getFrameToCanvasMatrix().mapRect(trackedPos); 141 | boxPaint.setColor(recognition.color); 142 | 143 | float cornerSize = Math.min(trackedPos.width(), trackedPos.height()) / 8.0f; 144 | canvas.drawRoundRect(trackedPos, cornerSize, cornerSize, boxPaint); 145 | 146 | final String labelString = 147 | !TextUtils.isEmpty(recognition.title) 148 | ?
String.format("%s %.2f", recognition.title, (100 * recognition.detectionConfidence)) 149 | : String.format("%.2f", (100 * recognition.detectionConfidence)); 150 | // borderedText.drawText(canvas, trackedPos.left + cornerSize, trackedPos.top, 151 | // labelString); 152 | borderedText.drawText( 153 | canvas, trackedPos.left + cornerSize, trackedPos.top, labelString + "%", boxPaint); 154 | } 155 | } 156 | 157 | private void processResults(final List<Recognition> results) { 158 | final List<Pair<Float, Recognition>> rectsToTrack = new LinkedList<Pair<Float, Recognition>>(); 159 | 160 | screenRects.clear(); 161 | final Matrix rgbFrameToScreen = new Matrix(getFrameToCanvasMatrix()); 162 | 163 | for (final Recognition result : results) { 164 | if (result.getLocation() == null) { 165 | continue; 166 | } 167 | final RectF detectionFrameRect = new RectF(result.getLocation()); 168 | 169 | final RectF detectionScreenRect = new RectF(); 170 | rgbFrameToScreen.mapRect(detectionScreenRect, detectionFrameRect); 171 | 172 | logger.v( 173 | "Result! Frame: " + result.getLocation() + " mapped to screen:" + detectionScreenRect); 174 | 175 | screenRects.add(new Pair<Float, RectF>(result.getConfidence(), detectionScreenRect)); 176 | 177 | if (detectionFrameRect.width() < MIN_SIZE || detectionFrameRect.height() < MIN_SIZE) { 178 | logger.w("Degenerate rectangle!
" + detectionFrameRect); 179 | continue; 180 | } 181 | 182 | rectsToTrack.add(new Pair<Float, Recognition>(result.getConfidence(), result)); 183 | } 184 | 185 | trackedObjects.clear(); 186 | if (rectsToTrack.isEmpty()) { 187 | logger.v("Nothing to track, aborting."); 188 | return; 189 | } 190 | 191 | for (final Pair<Float, Recognition> potential : rectsToTrack) { 192 | final TrackedRecognition trackedRecognition = new TrackedRecognition(); 193 | trackedRecognition.detectionConfidence = potential.first; 194 | trackedRecognition.location = new RectF(potential.second.getLocation()); 195 | trackedRecognition.title = potential.second.getTitle(); 196 | trackedRecognition.color = COLORS[trackedObjects.size()]; 197 | trackedObjects.add(trackedRecognition); 198 | 199 | if (trackedObjects.size() >= COLORS.length) { 200 | break; 201 | } 202 | } 203 | } 204 | 205 | private static class TrackedRecognition { 206 | RectF location; 207 | float detectionConfidence; 208 | int color; 209 | String title; 210 | } 211 | } 212 | -------------------------------------------------------------------------------- /android/app/src/main/res/drawable-hdpi/ic_launcher.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/main/res/drawable-hdpi/ic_launcher.png -------------------------------------------------------------------------------- /android/app/src/main/res/drawable-mdpi/ic_launcher.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/main/res/drawable-mdpi/ic_launcher.png -------------------------------------------------------------------------------- /android/app/src/main/res/drawable-v24/ic_launcher_foreground.xml: -------------------------------------------------------------------------------- 1 | 7 | 12 | 13 | 19
| 22 | 25 | 26 | 27 | 28 | 34 | 35 | -------------------------------------------------------------------------------- /android/app/src/main/res/drawable-v24/kite.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/main/res/drawable-v24/kite.jpg -------------------------------------------------------------------------------- /android/app/src/main/res/drawable-xxhdpi/ic_launcher.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/main/res/drawable-xxhdpi/ic_launcher.png -------------------------------------------------------------------------------- /android/app/src/main/res/drawable-xxhdpi/icn_chevron_down.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/main/res/drawable-xxhdpi/icn_chevron_down.png -------------------------------------------------------------------------------- /android/app/src/main/res/drawable-xxhdpi/icn_chevron_up.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/main/res/drawable-xxhdpi/icn_chevron_up.png -------------------------------------------------------------------------------- /android/app/src/main/res/drawable-xxhdpi/tfl2_logo.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/main/res/drawable-xxhdpi/tfl2_logo.png 
--------------------------------------------------------------------------------
/android/app/src/main/res/drawable-xxhdpi/tfl2_logo_dark.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/main/res/drawable-xxhdpi/tfl2_logo_dark.png
--------------------------------------------------------------------------------
/android/app/src/main/res/drawable-xxxhdpi/caret.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/main/res/drawable-xxxhdpi/caret.jpg
--------------------------------------------------------------------------------
/android/app/src/main/res/drawable-xxxhdpi/chair.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/main/res/drawable-xxxhdpi/chair.jpg
--------------------------------------------------------------------------------
/android/app/src/main/res/drawable-xxxhdpi/sample_image.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/hunglc007/tensorflow-yolov4-tflite/9f16748aa3f45ff240608da4bd9b1216a29127f5/android/app/src/main/res/drawable-xxxhdpi/sample_image.jpg
--------------------------------------------------------------------------------
/android/app/src/main/res/drawable/bottom_sheet_bg.xml:
--------------------------------------------------------------------------------
(drawable XML markup not recoverable from this dump)
--------------------------------------------------------------------------------
/android/app/src/main/res/drawable/ic_baseline_add.xml:
--------------------------------------------------------------------------------
(drawable XML markup not recoverable from this dump)
--------------------------------------------------------------------------------
/android/app/src/main/res/drawable/ic_baseline_remove.xml:
--------------------------------------------------------------------------------
(drawable XML markup not recoverable from this dump)
--------------------------------------------------------------------------------
/android/app/src/main/res/drawable/ic_launcher_background.xml:
--------------------------------------------------------------------------------
(drawable XML markup not recoverable from this dump)
--------------------------------------------------------------------------------
/android/app/src/main/res/drawable/rectangle.xml:
--------------------------------------------------------------------------------
(drawable XML markup not recoverable from this dump)
--------------------------------------------------------------------------------
/android/app/src/main/res/layout/activity_main.xml:
--------------------------------------------------------------------------------
(layout XML markup not recoverable from this dump)
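The `MultiBoxTracker` code in this dump assigns each newly tracked detection the next color from a fixed `COLORS` palette (`COLORS[trackedObjects.size()]`) and stops tracking further detections once every palette entry is used. A minimal standalone sketch of that cap logic, assuming a hypothetical four-entry palette (the real app's `COLORS` array and its values are not shown in this chunk):

```java
import java.util.ArrayList;
import java.util.List;

public class TrackerColorDemo {
    // Placeholder ARGB palette; MultiBoxTracker's actual COLORS array differs.
    static final int[] COLORS = {0xFF0000FF, 0xFFFF0000, 0xFF00FF00, 0xFFFFFF00};

    // Give each detection the next unused color, and stop accepting new
    // detections once the palette is exhausted, mirroring the break in
    // MultiBoxTracker's tracking loop.
    static List<Integer> assignColors(int detectionCount) {
        List<Integer> assigned = new ArrayList<>();
        for (int i = 0; i < detectionCount; i++) {
            assigned.add(COLORS[assigned.size()]);
            if (assigned.size() >= COLORS.length) {
                break; // no colors left; remaining detections are dropped
            }
        }
        return assigned;
    }

    public static void main(String[] args) {
        // 10 detections, but only COLORS.length of them get tracked.
        System.out.println(assignColors(10).size()); // prints 4
    }
}
```

Because tracked objects are indexed by `trackedObjects.size()` at insertion time, the palette size is also the hard upper bound on how many boxes the overlay can draw per frame.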