├── README.md
├── android_app
│   └── android_app
│       ├── .gitignore
│       ├── README.md
│       ├── app
│       │   ├── .gitignore
│       │   ├── build.gradle.kts
│       │   ├── proguard-rules.pro
│       │   └── src
│       │       ├── androidTest
│       │       │   └── java
│       │       │       └── com
│       │       │           └── surendramaran
│       │       │               └── yolov8tflite
│       │       │                   └── ExampleInstrumentedTest.kt
│       │       ├── main
│       │       │   ├── AndroidManifest.xml
│       │       │   ├── assets
│       │       │   │   ├── labels.txt
│       │       │   │   └── model.tflite
│       │       │   ├── java
│       │       │   │   └── com
│       │       │   │       └── surendramaran
│       │       │   │           └── yolov8tflite
│       │       │   │               ├── BoundingBox.kt
│       │       │   │               ├── Constants.kt
│       │       │   │               ├── Detector.kt
│       │       │   │               ├── MainActivity.kt
│       │       │   │               └── OverlayView.kt
│       │       │   └── res
│       │       │       ├── drawable-v24
│       │       │       │   └── ic_launcher_foreground.xml
│       │       │       ├── drawable
│       │       │       │   └── ic_launcher_background.xml
│       │       │       ├── layout
│       │       │       │   └── activity_main.xml
│       │       │       ├── mipmap-anydpi-v26
│       │       │       │   ├── ic_launcher.xml
│       │       │       │   └── ic_launcher_round.xml
│       │       │       ├── mipmap-hdpi
│       │       │       │   ├── ic_launcher.webp
│       │       │       │   └── ic_launcher_round.webp
│       │       │       ├── mipmap-mdpi
│       │       │       │   ├── ic_launcher.webp
│       │       │       │   └── ic_launcher_round.webp
│       │       │       ├── mipmap-xhdpi
│       │       │       │   ├── ic_launcher.webp
│       │       │       │   └── ic_launcher_round.webp
│       │       │       ├── mipmap-xxhdpi
│       │       │       │   ├── ic_launcher.webp
│       │       │       │   └── ic_launcher_round.webp
│       │       │       ├── mipmap-xxxhdpi
│       │       │       │   ├── ic_launcher.webp
│       │       │       │   └── ic_launcher_round.webp
│       │       │       ├── values-night
│       │       │       │   └── themes.xml
│       │       │       ├── values
│       │       │       │   ├── colors.xml
│       │       │       │   ├── strings.xml
│       │       │       │   └── themes.xml
│       │       │       └── xml
│       │       │           ├── backup_rules.xml
│       │       │           └── data_extraction_rules.xml
│       │       └── test
│       │           └── java
│       │               └── com
│       │                   └── surendramaran
│       │                       └── yolov8tflite
│       │                           └── ExampleUnitTest.kt
│       ├── build.gradle.kts
│       ├── gradle.properties
│       ├── gradle
│       │   └── wrapper
│       │       ├── gradle-wrapper.jar
│       │       └── gradle-wrapper.properties
│       ├── gradlew
│       ├── gradlew.bat
│       └── settings.gradle.kts
├── test_images
│   └── b.jpg
├── train_export_yolov8_model.ipynb
└── update.py

/README.md:
--------------------------------------------------------------------------------
# Object detection app using YOLOv8 and Android

### Check the video to understand the code: https://youtu.be/dl7rCmvIyiI

## Step 1 (Train and export the object detection model):
- Clone this repository: `git clone https://github.com/AarohiSingla/Object-Detection-Android-App.git`

- Train a YOLOv8 model on your custom dataset and export it to .tflite format (see train_export_yolov8_model.ipynb); a minimal export sketch follows this section.
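The training notebook itself is not reproduced in this dump, so the following is only a minimal sketch of what Step 1 typically looks like with the Ultralytics YOLOv8 API, followed by a quick check of the exported model's tensor shapes (Detector.kt reads the input and output shapes at setup, so it is worth confirming the export before copying it into the app). The base weights `yolov8n.pt`, the dataset file `data.yaml`, and the training settings are placeholder assumptions; substitute your own.

```python
# Minimal Step 1 sketch using the Ultralytics YOLOv8 API.
# Assumptions: the ultralytics and tensorflow packages are installed, and
# "data.yaml" describes your custom dataset; adjust weights/epochs/imgsz.
from ultralytics import YOLO

# Fine-tune a pretrained checkpoint on the custom dataset.
model = YOLO("yolov8n.pt")
model.train(data="data.yaml", epochs=100, imgsz=640)

# Export the trained model to TFLite; Ultralytics writes the .tflite file
# and returns its path.
tflite_path = model.export(format="tflite")
print("Exported:", tflite_path)

# Optional sanity check before copying the file into the app's assets folder:
# print the input and output tensor shapes that Detector.kt will read.
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path=str(tflite_path))
interpreter.allocate_tensors()
print("Input shape:", interpreter.get_input_details()[0]["shape"])
print("Output shape:", interpreter.get_output_details()[0]["shape"])
```

Copy the exported .tflite file into the app's assets folder as model.tflite (or change MODEL_PATH in Constants.kt), together with a labels.txt listing your class names, as described in Step 2.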
## Step 2 (Object detection Android app setup):
- Open the android_app folder.

- Put your .tflite model and .txt label file inside the assets folder. You can find the assets folder at this location: android_app\android_app\app\src\main\assets

- Update the model and label file paths (MODEL_PATH and LABELS_PATH) in the Constants.kt file. You can find Constants.kt at this location: android_app\android_app\app\src\main\java\com\surendramaran\yolov8tflite

- Download and install Android Studio from the official website (https://developer.android.com/studio).

- Once installed, open Android Studio from your applications menu.

- When Android Studio opens, you'll see a welcome screen with options to create a new project, open an existing project, or check out a project from version control. Since you already have a project, click "Open an existing Android Studio project".

- Navigate to the directory where your project is located and select the project's root folder.

- Build and Run

![SAD1K9IGLAXS_jpg rf d634b3e06bebf7dd7d15b3e699e359d2](https://github.com/AarohiSingla/Object-Detection-Android-App/assets/60029146/08610d96-54e5-4425-85f9-c92e14f87a14)

## Credits

This project includes the Android app code from the following repository:

- [YOLOv8-TfLite-Object-Detector](https://github.com/surendramaran/YOLOv8-TfLite-Object-Detector)

Special thanks to [surendramaran](https://github.com/surendramaran) for their contribution.

--------------------------------------------------------------------------------
/android_app/android_app/.gitignore:
--------------------------------------------------------------------------------
*.iml
.gradle
/local.properties
/.idea/caches
/.idea/libraries
/.idea/modules.xml
/.idea/workspace.xml
/.idea/navEditor.xml
/.idea/assetWizardSettings.xml
.DS_Store
/build
/captures
.externalNativeBuild
.cxx
local.properties

--------------------------------------------------------------------------------
/android_app/android_app/README.md:
--------------------------------------------------------------------------------
## YOLOv8 Live Object Detection Android Application

### Description
This Android application performs live object detection using the YOLOv8 machine learning model. YOLOv8 (You Only Look Once, version 8) is known for its real-time object detection capabilities, and this app brings that functionality to Android devices.

### Getting Started
To use this repository with any custom YOLOv8 object detection model, follow these steps:
1. Clone this repository to your local machine using `git clone https://github.com/surendramaran/YOLOv8-TfLite-Object-Detector`.
2. Put your .tflite model and .txt label file inside the assets folder.
3. Update the model and label file paths in Constants.kt.
4.
**Build and Run:** 12 | 13 | 14 | -------------------------------------------------------------------------------- /android_app/android_app/app/.gitignore: -------------------------------------------------------------------------------- 1 | /build -------------------------------------------------------------------------------- /android_app/android_app/app/build.gradle.kts: -------------------------------------------------------------------------------- 1 | plugins { 2 | id("com.android.application") 3 | id("org.jetbrains.kotlin.android") 4 | } 5 | 6 | android { 7 | namespace = "com.surendramaran.yolov8tflite" 8 | compileSdk = 34 9 | 10 | defaultConfig { 11 | applicationId = "com.surendramaran.yolov8tflite" 12 | minSdk = 21 13 | targetSdk = 34 14 | versionCode = 1 15 | versionName = "1.0" 16 | 17 | testInstrumentationRunner = "androidx.test.runner.AndroidJUnitRunner" 18 | } 19 | 20 | buildTypes { 21 | release { 22 | isMinifyEnabled = false 23 | proguardFiles( 24 | getDefaultProguardFile("proguard-android-optimize.txt"), 25 | "proguard-rules.pro" 26 | ) 27 | } 28 | } 29 | compileOptions { 30 | sourceCompatibility = JavaVersion.VERSION_1_8 31 | targetCompatibility = JavaVersion.VERSION_1_8 32 | } 33 | kotlinOptions { 34 | jvmTarget = "1.8" 35 | } 36 | 37 | buildFeatures { 38 | viewBinding = true 39 | } 40 | } 41 | 42 | dependencies { 43 | 44 | implementation("androidx.core:core-ktx:1.12.0") 45 | implementation("androidx.appcompat:appcompat:1.6.1") 46 | implementation("com.google.android.material:material:1.11.0") 47 | implementation("androidx.constraintlayout:constraintlayout:2.1.4") 48 | testImplementation("junit:junit:4.13.2") 49 | androidTestImplementation("androidx.test.ext:junit:1.1.5") 50 | androidTestImplementation("androidx.test.espresso:espresso-core:3.5.1") 51 | 52 | val cameraxVersion = "1.4.0-alpha03" 53 | implementation("androidx.camera:camera-camera2:${cameraxVersion}") 54 | implementation("androidx.camera:camera-lifecycle:${cameraxVersion}") 55 | implementation("androidx.camera:camera-view:${cameraxVersion}") 56 | 57 | implementation("org.tensorflow:tensorflow-lite:2.14.0") 58 | implementation("org.tensorflow:tensorflow-lite-support:0.4.4") 59 | } -------------------------------------------------------------------------------- /android_app/android_app/app/proguard-rules.pro: -------------------------------------------------------------------------------- 1 | # Add project specific ProGuard rules here. 2 | # You can control the set of applied configuration files using the 3 | # proguardFiles setting in build.gradle. 4 | # 5 | # For more details, see 6 | # http://developer.android.com/guide/developing/tools/proguard.html 7 | 8 | # If your project uses WebView with JS, uncomment the following 9 | # and specify the fully qualified class name to the JavaScript interface 10 | # class: 11 | #-keepclassmembers class fqcn.of.javascript.interface.for.webview { 12 | # public *; 13 | #} 14 | 15 | # Uncomment this to preserve the line number information for 16 | # debugging stack traces. 17 | #-keepattributes SourceFile,LineNumberTable 18 | 19 | # If you keep the line number information, uncomment this to 20 | # hide the original source file name. 
21 | #-renamesourcefileattribute SourceFile -------------------------------------------------------------------------------- /android_app/android_app/app/src/androidTest/java/com/surendramaran/yolov8tflite/ExampleInstrumentedTest.kt: -------------------------------------------------------------------------------- 1 | package com.surendramaran.yolov8tflite 2 | 3 | import androidx.test.platform.app.InstrumentationRegistry 4 | import androidx.test.ext.junit.runners.AndroidJUnit4 5 | 6 | import org.junit.Test 7 | import org.junit.runner.RunWith 8 | 9 | import org.junit.Assert.* 10 | 11 | /** 12 | * Instrumented test, which will execute on an Android device. 13 | * 14 | * See [testing documentation](http://d.android.com/tools/testing). 15 | */ 16 | @RunWith(AndroidJUnit4::class) 17 | class ExampleInstrumentedTest { 18 | @Test 19 | fun useAppContext() { 20 | // Context of the app under test. 21 | val appContext = InstrumentationRegistry.getInstrumentation().targetContext 22 | assertEquals("com.surendramaran.yolov8tflite", appContext.packageName) 23 | } 24 | } -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/AndroidManifest.xml: -------------------------------------------------------------------------------- 1 | 2 | 4 | 5 | 6 | 7 | 8 | 18 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28 | 29 | -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/assets/labels.txt: -------------------------------------------------------------------------------- 1 | fork 2 | knife 3 | plate 4 | spoon -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/assets/model.tflite: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AarohiSingla/Object-Detection-Android-App/969ba21cd6c50dbce911b3145225e0919e554f6d/android_app/android_app/app/src/main/assets/model.tflite -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/java/com/surendramaran/yolov8tflite/BoundingBox.kt: -------------------------------------------------------------------------------- 1 | package com.surendramaran.yolov8tflite 2 | 3 | data class BoundingBox( 4 | val x1: Float, 5 | val y1: Float, 6 | val x2: Float, 7 | val y2: Float, 8 | val cx: Float, 9 | val cy: Float, 10 | val w: Float, 11 | val h: Float, 12 | val cnf: Float, 13 | val cls: Int, 14 | val clsName: String 15 | ) -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/java/com/surendramaran/yolov8tflite/Constants.kt: -------------------------------------------------------------------------------- 1 | package com.surendramaran.yolov8tflite 2 | 3 | object Constants { 4 | const val MODEL_PATH = "model.tflite" 5 | const val LABELS_PATH = "labels.txt" 6 | } 7 | -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/java/com/surendramaran/yolov8tflite/Detector.kt: -------------------------------------------------------------------------------- 1 | package com.surendramaran.yolov8tflite 2 | 3 | import android.content.Context 4 | import android.graphics.Bitmap 5 | import android.os.SystemClock 6 | import org.tensorflow.lite.DataType 7 | import org.tensorflow.lite.Interpreter 8 | import org.tensorflow.lite.support.common.FileUtil 9 | import 
org.tensorflow.lite.support.common.ops.CastOp 10 | import org.tensorflow.lite.support.common.ops.NormalizeOp 11 | import org.tensorflow.lite.support.image.ImageProcessor 12 | import org.tensorflow.lite.support.image.TensorImage 13 | import org.tensorflow.lite.support.tensorbuffer.TensorBuffer 14 | import java.io.BufferedReader 15 | import java.io.IOException 16 | import java.io.InputStream 17 | import java.io.InputStreamReader 18 | 19 | class Detector( 20 | private val context: Context, 21 | private val modelPath: String, 22 | private val labelPath: String, 23 | private val detectorListener: DetectorListener 24 | ) { 25 | 26 | private var interpreter: Interpreter? = null 27 | private var labels = mutableListOf() 28 | 29 | private var tensorWidth = 0 30 | private var tensorHeight = 0 31 | private var numChannel = 0 32 | private var numElements = 0 33 | 34 | private val imageProcessor = ImageProcessor.Builder() 35 | .add(NormalizeOp(INPUT_MEAN, INPUT_STANDARD_DEVIATION)) 36 | .add(CastOp(INPUT_IMAGE_TYPE)) 37 | .build() 38 | 39 | fun setup() { 40 | val model = FileUtil.loadMappedFile(context, modelPath) 41 | val options = Interpreter.Options() 42 | options.numThreads = 4 43 | interpreter = Interpreter(model, options) 44 | 45 | val inputShape = interpreter?.getInputTensor(0)?.shape() ?: return 46 | val outputShape = interpreter?.getOutputTensor(0)?.shape() ?: return 47 | 48 | tensorWidth = inputShape[1] 49 | tensorHeight = inputShape[2] 50 | numChannel = outputShape[1] 51 | numElements = outputShape[2] 52 | 53 | try { 54 | val inputStream: InputStream = context.assets.open(labelPath) 55 | val reader = BufferedReader(InputStreamReader(inputStream)) 56 | 57 | var line: String? = reader.readLine() 58 | while (line != null && line != "") { 59 | labels.add(line) 60 | line = reader.readLine() 61 | } 62 | 63 | reader.close() 64 | inputStream.close() 65 | } catch (e: IOException) { 66 | e.printStackTrace() 67 | } 68 | } 69 | 70 | fun clear() { 71 | interpreter?.close() 72 | interpreter = null 73 | } 74 | 75 | fun detect(frame: Bitmap) { 76 | interpreter ?: return 77 | if (tensorWidth == 0) return 78 | if (tensorHeight == 0) return 79 | if (numChannel == 0) return 80 | if (numElements == 0) return 81 | 82 | var inferenceTime = SystemClock.uptimeMillis() 83 | 84 | val resizedBitmap = Bitmap.createScaledBitmap(frame, tensorWidth, tensorHeight, false) 85 | 86 | val tensorImage = TensorImage(DataType.FLOAT32) 87 | tensorImage.load(resizedBitmap) 88 | val processedImage = imageProcessor.process(tensorImage) 89 | val imageBuffer = processedImage.buffer 90 | 91 | val output = TensorBuffer.createFixedSize(intArrayOf(1 , numChannel, numElements), OUTPUT_IMAGE_TYPE) 92 | interpreter?.run(imageBuffer, output.buffer) 93 | 94 | 95 | val bestBoxes = bestBox(output.floatArray) 96 | inferenceTime = SystemClock.uptimeMillis() - inferenceTime 97 | 98 | 99 | if (bestBoxes == null) { 100 | detectorListener.onEmptyDetect() 101 | return 102 | } 103 | 104 | detectorListener.onDetect(bestBoxes, inferenceTime) 105 | } 106 | 107 | private fun bestBox(array: FloatArray) : List? 
{ 108 | 109 | val boundingBoxes = mutableListOf() 110 | 111 | for (c in 0 until numElements) { 112 | var maxConf = -1.0f 113 | var maxIdx = -1 114 | var j = 4 115 | var arrayIdx = c + numElements * j 116 | while (j < numChannel){ 117 | if (array[arrayIdx] > maxConf) { 118 | maxConf = array[arrayIdx] 119 | maxIdx = j - 4 120 | } 121 | j++ 122 | arrayIdx += numElements 123 | } 124 | 125 | if (maxConf > CONFIDENCE_THRESHOLD) { 126 | val clsName = labels[maxIdx] 127 | val cx = array[c] // 0 128 | val cy = array[c + numElements] // 1 129 | val w = array[c + numElements * 2] 130 | val h = array[c + numElements * 3] 131 | val x1 = cx - (w/2F) 132 | val y1 = cy - (h/2F) 133 | val x2 = cx + (w/2F) 134 | val y2 = cy + (h/2F) 135 | if (x1 < 0F || x1 > 1F) continue 136 | if (y1 < 0F || y1 > 1F) continue 137 | if (x2 < 0F || x2 > 1F) continue 138 | if (y2 < 0F || y2 > 1F) continue 139 | 140 | boundingBoxes.add( 141 | BoundingBox( 142 | x1 = x1, y1 = y1, x2 = x2, y2 = y2, 143 | cx = cx, cy = cy, w = w, h = h, 144 | cnf = maxConf, cls = maxIdx, clsName = clsName 145 | ) 146 | ) 147 | } 148 | } 149 | 150 | if (boundingBoxes.isEmpty()) return null 151 | 152 | return applyNMS(boundingBoxes) 153 | } 154 | 155 | private fun applyNMS(boxes: List) : MutableList { 156 | val sortedBoxes = boxes.sortedByDescending { it.cnf }.toMutableList() 157 | val selectedBoxes = mutableListOf() 158 | 159 | while(sortedBoxes.isNotEmpty()) { 160 | val first = sortedBoxes.first() 161 | selectedBoxes.add(first) 162 | sortedBoxes.remove(first) 163 | 164 | val iterator = sortedBoxes.iterator() 165 | while (iterator.hasNext()) { 166 | val nextBox = iterator.next() 167 | val iou = calculateIoU(first, nextBox) 168 | if (iou >= IOU_THRESHOLD) { 169 | iterator.remove() 170 | } 171 | } 172 | } 173 | 174 | return selectedBoxes 175 | } 176 | 177 | private fun calculateIoU(box1: BoundingBox, box2: BoundingBox): Float { 178 | val x1 = maxOf(box1.x1, box2.x1) 179 | val y1 = maxOf(box1.y1, box2.y1) 180 | val x2 = minOf(box1.x2, box2.x2) 181 | val y2 = minOf(box1.y2, box2.y2) 182 | val intersectionArea = maxOf(0F, x2 - x1) * maxOf(0F, y2 - y1) 183 | val box1Area = box1.w * box1.h 184 | val box2Area = box2.w * box2.h 185 | return intersectionArea / (box1Area + box2Area - intersectionArea) 186 | } 187 | 188 | interface DetectorListener { 189 | fun onEmptyDetect() 190 | fun onDetect(boundingBoxes: List, inferenceTime: Long) 191 | } 192 | 193 | companion object { 194 | private const val INPUT_MEAN = 0f 195 | private const val INPUT_STANDARD_DEVIATION = 255f 196 | private val INPUT_IMAGE_TYPE = DataType.FLOAT32 197 | private val OUTPUT_IMAGE_TYPE = DataType.FLOAT32 198 | private const val CONFIDENCE_THRESHOLD = 0.3F 199 | private const val IOU_THRESHOLD = 0.5F 200 | } 201 | } -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/java/com/surendramaran/yolov8tflite/MainActivity.kt: -------------------------------------------------------------------------------- 1 | package com.surendramaran.yolov8tflite 2 | 3 | import android.Manifest 4 | import android.content.pm.PackageManager 5 | import android.graphics.Bitmap 6 | import android.graphics.Matrix 7 | import android.os.Bundle 8 | import android.util.Log 9 | import androidx.activity.result.contract.ActivityResultContracts 10 | import androidx.appcompat.app.AppCompatActivity 11 | import androidx.camera.core.AspectRatio 12 | import androidx.camera.core.Camera 13 | import androidx.camera.core.CameraSelector 14 | import 
androidx.camera.core.ImageAnalysis 15 | import androidx.camera.core.Preview 16 | import androidx.camera.lifecycle.ProcessCameraProvider 17 | import androidx.core.app.ActivityCompat 18 | import androidx.core.content.ContextCompat 19 | import com.surendramaran.yolov8tflite.Constants.LABELS_PATH 20 | import com.surendramaran.yolov8tflite.Constants.MODEL_PATH 21 | import com.surendramaran.yolov8tflite.databinding.ActivityMainBinding 22 | import java.util.concurrent.ExecutorService 23 | import java.util.concurrent.Executors 24 | 25 | class MainActivity : AppCompatActivity(), Detector.DetectorListener { 26 | private lateinit var binding: ActivityMainBinding 27 | private val isFrontCamera = false 28 | 29 | private var preview: Preview? = null 30 | private var imageAnalyzer: ImageAnalysis? = null 31 | private var camera: Camera? = null 32 | private var cameraProvider: ProcessCameraProvider? = null 33 | private lateinit var detector: Detector 34 | 35 | private lateinit var cameraExecutor: ExecutorService 36 | 37 | override fun onCreate(savedInstanceState: Bundle?) { 38 | super.onCreate(savedInstanceState) 39 | binding = ActivityMainBinding.inflate(layoutInflater) 40 | setContentView(binding.root) 41 | 42 | detector = Detector(baseContext, MODEL_PATH, LABELS_PATH, this) 43 | detector.setup() 44 | 45 | if (allPermissionsGranted()) { 46 | startCamera() 47 | } else { 48 | ActivityCompat.requestPermissions(this, REQUIRED_PERMISSIONS, REQUEST_CODE_PERMISSIONS) 49 | } 50 | 51 | cameraExecutor = Executors.newSingleThreadExecutor() 52 | } 53 | 54 | private fun startCamera() { 55 | val cameraProviderFuture = ProcessCameraProvider.getInstance(this) 56 | cameraProviderFuture.addListener({ 57 | cameraProvider = cameraProviderFuture.get() 58 | bindCameraUseCases() 59 | }, ContextCompat.getMainExecutor(this)) 60 | } 61 | 62 | private fun bindCameraUseCases() { 63 | val cameraProvider = cameraProvider ?: throw IllegalStateException("Camera initialization failed.") 64 | 65 | val rotation = binding.viewFinder.display.rotation 66 | 67 | val cameraSelector = CameraSelector 68 | .Builder() 69 | .requireLensFacing(CameraSelector.LENS_FACING_BACK) 70 | .build() 71 | 72 | preview = Preview.Builder() 73 | .setTargetAspectRatio(AspectRatio.RATIO_4_3) 74 | .setTargetRotation(rotation) 75 | .build() 76 | 77 | imageAnalyzer = ImageAnalysis.Builder() 78 | .setTargetAspectRatio(AspectRatio.RATIO_4_3) 79 | .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST) 80 | .setTargetRotation(binding.viewFinder.display.rotation) 81 | .setOutputImageFormat(ImageAnalysis.OUTPUT_IMAGE_FORMAT_RGBA_8888) 82 | .build() 83 | 84 | imageAnalyzer?.setAnalyzer(cameraExecutor) { imageProxy -> 85 | val bitmapBuffer = 86 | Bitmap.createBitmap( 87 | imageProxy.width, 88 | imageProxy.height, 89 | Bitmap.Config.ARGB_8888 90 | ) 91 | imageProxy.use { bitmapBuffer.copyPixelsFromBuffer(imageProxy.planes[0].buffer) } 92 | imageProxy.close() 93 | 94 | val matrix = Matrix().apply { 95 | postRotate(imageProxy.imageInfo.rotationDegrees.toFloat()) 96 | 97 | if (isFrontCamera) { 98 | postScale( 99 | -1f, 100 | 1f, 101 | imageProxy.width.toFloat(), 102 | imageProxy.height.toFloat() 103 | ) 104 | } 105 | } 106 | 107 | val rotatedBitmap = Bitmap.createBitmap( 108 | bitmapBuffer, 0, 0, bitmapBuffer.width, bitmapBuffer.height, 109 | matrix, true 110 | ) 111 | 112 | detector.detect(rotatedBitmap) 113 | } 114 | 115 | cameraProvider.unbindAll() 116 | 117 | try { 118 | camera = cameraProvider.bindToLifecycle( 119 | this, 120 | cameraSelector, 121 | preview, 122 | 
imageAnalyzer 123 | ) 124 | 125 | preview?.setSurfaceProvider(binding.viewFinder.surfaceProvider) 126 | } catch(exc: Exception) { 127 | Log.e(TAG, "Use case binding failed", exc) 128 | } 129 | } 130 | 131 | private fun allPermissionsGranted() = REQUIRED_PERMISSIONS.all { 132 | ContextCompat.checkSelfPermission(baseContext, it) == PackageManager.PERMISSION_GRANTED 133 | } 134 | 135 | private val requestPermissionLauncher = registerForActivityResult( 136 | ActivityResultContracts.RequestMultiplePermissions()) { 137 | if (it[Manifest.permission.CAMERA] == true) { startCamera() } 138 | } 139 | 140 | override fun onDestroy() { 141 | super.onDestroy() 142 | detector.clear() 143 | cameraExecutor.shutdown() 144 | } 145 | 146 | override fun onResume() { 147 | super.onResume() 148 | if (allPermissionsGranted()){ 149 | startCamera() 150 | } else { 151 | requestPermissionLauncher.launch(REQUIRED_PERMISSIONS) 152 | } 153 | } 154 | 155 | companion object { 156 | private const val TAG = "Camera" 157 | private const val REQUEST_CODE_PERMISSIONS = 10 158 | private val REQUIRED_PERMISSIONS = mutableListOf ( 159 | Manifest.permission.CAMERA 160 | ).toTypedArray() 161 | } 162 | 163 | override fun onEmptyDetect() { 164 | binding.overlay.invalidate() 165 | } 166 | 167 | override fun onDetect(boundingBoxes: List, inferenceTime: Long) { 168 | runOnUiThread { 169 | binding.inferenceTime.text = "${inferenceTime}ms" 170 | binding.overlay.apply { 171 | setResults(boundingBoxes) 172 | invalidate() 173 | } 174 | } 175 | } 176 | } 177 | -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/java/com/surendramaran/yolov8tflite/OverlayView.kt: -------------------------------------------------------------------------------- 1 | package com.surendramaran.yolov8tflite 2 | 3 | import android.content.Context 4 | import android.graphics.Canvas 5 | import android.graphics.Color 6 | import android.graphics.Paint 7 | import android.graphics.Rect 8 | import android.graphics.RectF 9 | import android.util.AttributeSet 10 | import android.view.View 11 | import androidx.core.content.ContextCompat 12 | import java.util.LinkedList 13 | import kotlin.math.max 14 | 15 | class OverlayView(context: Context?, attrs: AttributeSet?) 
: View(context, attrs) { 16 | 17 | private var results = listOf() 18 | private var boxPaint = Paint() 19 | private var textBackgroundPaint = Paint() 20 | private var textPaint = Paint() 21 | 22 | private var bounds = Rect() 23 | 24 | init { 25 | initPaints() 26 | } 27 | 28 | fun clear() { 29 | textPaint.reset() 30 | textBackgroundPaint.reset() 31 | boxPaint.reset() 32 | invalidate() 33 | initPaints() 34 | } 35 | 36 | private fun initPaints() { 37 | textBackgroundPaint.color = Color.BLACK 38 | textBackgroundPaint.style = Paint.Style.FILL 39 | textBackgroundPaint.textSize = 50f 40 | 41 | textPaint.color = Color.WHITE 42 | textPaint.style = Paint.Style.FILL 43 | textPaint.textSize = 50f 44 | 45 | boxPaint.color = ContextCompat.getColor(context!!, R.color.bounding_box_color) 46 | boxPaint.strokeWidth = 8F 47 | boxPaint.style = Paint.Style.STROKE 48 | } 49 | 50 | override fun draw(canvas: Canvas) { 51 | super.draw(canvas) 52 | 53 | results.forEach { 54 | val left = it.x1 * width 55 | val top = it.y1 * height 56 | val right = it.x2 * width 57 | val bottom = it.y2 * height 58 | 59 | canvas.drawRect(left, top, right, bottom, boxPaint) 60 | val drawableText = it.clsName 61 | 62 | textBackgroundPaint.getTextBounds(drawableText, 0, drawableText.length, bounds) 63 | val textWidth = bounds.width() 64 | val textHeight = bounds.height() 65 | canvas.drawRect( 66 | left, 67 | top, 68 | left + textWidth + BOUNDING_RECT_TEXT_PADDING, 69 | top + textHeight + BOUNDING_RECT_TEXT_PADDING, 70 | textBackgroundPaint 71 | ) 72 | canvas.drawText(drawableText, left, top + bounds.height(), textPaint) 73 | 74 | } 75 | } 76 | 77 | fun setResults(boundingBoxes: List) { 78 | results = boundingBoxes 79 | invalidate() 80 | } 81 | 82 | companion object { 83 | private const val BOUNDING_RECT_TEXT_PADDING = 8 84 | } 85 | } -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/drawable-v24/ic_launcher_foreground.xml: -------------------------------------------------------------------------------- 1 | 7 | 8 | 9 | 15 | 18 | 21 | 22 | 23 | 24 | 30 | -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/drawable/ic_launcher_background.xml: -------------------------------------------------------------------------------- 1 | 2 | 7 | 10 | 15 | 20 | 25 | 30 | 35 | 40 | 45 | 50 | 55 | 60 | 65 | 70 | 75 | 80 | 85 | 90 | 95 | 100 | 105 | 110 | 115 | 120 | 125 | 130 | 135 | 140 | 145 | 150 | 155 | 160 | 165 | 170 | 171 | -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/layout/activity_main.xml: -------------------------------------------------------------------------------- 1 | 2 | 9 | 10 | 22 | 23 | 35 | 36 | 45 | 46 | 47 | 48 | 49 | 50 | -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/mipmap-anydpi-v26/ic_launcher.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/mipmap-anydpi-v26/ic_launcher_round.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 4 | 5 | 6 | -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/mipmap-hdpi/ic_launcher.webp: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/AarohiSingla/Object-Detection-Android-App/969ba21cd6c50dbce911b3145225e0919e554f6d/android_app/android_app/app/src/main/res/mipmap-hdpi/ic_launcher.webp -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/mipmap-hdpi/ic_launcher_round.webp: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AarohiSingla/Object-Detection-Android-App/969ba21cd6c50dbce911b3145225e0919e554f6d/android_app/android_app/app/src/main/res/mipmap-hdpi/ic_launcher_round.webp -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/mipmap-mdpi/ic_launcher.webp: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AarohiSingla/Object-Detection-Android-App/969ba21cd6c50dbce911b3145225e0919e554f6d/android_app/android_app/app/src/main/res/mipmap-mdpi/ic_launcher.webp -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/mipmap-mdpi/ic_launcher_round.webp: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AarohiSingla/Object-Detection-Android-App/969ba21cd6c50dbce911b3145225e0919e554f6d/android_app/android_app/app/src/main/res/mipmap-mdpi/ic_launcher_round.webp -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/mipmap-xhdpi/ic_launcher.webp: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AarohiSingla/Object-Detection-Android-App/969ba21cd6c50dbce911b3145225e0919e554f6d/android_app/android_app/app/src/main/res/mipmap-xhdpi/ic_launcher.webp -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/mipmap-xhdpi/ic_launcher_round.webp: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AarohiSingla/Object-Detection-Android-App/969ba21cd6c50dbce911b3145225e0919e554f6d/android_app/android_app/app/src/main/res/mipmap-xhdpi/ic_launcher_round.webp -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/mipmap-xxhdpi/ic_launcher.webp: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AarohiSingla/Object-Detection-Android-App/969ba21cd6c50dbce911b3145225e0919e554f6d/android_app/android_app/app/src/main/res/mipmap-xxhdpi/ic_launcher.webp -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/mipmap-xxhdpi/ic_launcher_round.webp: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AarohiSingla/Object-Detection-Android-App/969ba21cd6c50dbce911b3145225e0919e554f6d/android_app/android_app/app/src/main/res/mipmap-xxhdpi/ic_launcher_round.webp -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/mipmap-xxxhdpi/ic_launcher.webp: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/AarohiSingla/Object-Detection-Android-App/969ba21cd6c50dbce911b3145225e0919e554f6d/android_app/android_app/app/src/main/res/mipmap-xxxhdpi/ic_launcher.webp -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/mipmap-xxxhdpi/ic_launcher_round.webp: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AarohiSingla/Object-Detection-Android-App/969ba21cd6c50dbce911b3145225e0919e554f6d/android_app/android_app/app/src/main/res/mipmap-xxxhdpi/ic_launcher_round.webp -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/values-night/themes.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 7 | -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/values/colors.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | #FF000000 4 | #FFFFFFFF 5 | #234567 6 | -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/values/strings.xml: -------------------------------------------------------------------------------- 1 | 2 | YOLOv8 TfLite 3 | -------------------------------------------------------------------------------- /android_app/android_app/app/src/main/res/values/themes.xml: -------------------------------------------------------------------------------- 1 | 2 | 3 | 7 | 8 |