In this tutorial, we'll integrate OpenCV into an Android application. We have already discussed the basics of CameraX here. Today, we'll see how to run OpenCV image processing on a live camera feed.
Android CameraX
CameraX, which comes with Android Jetpack, is all about use cases. Basically, a camera pipeline can be built from three core use cases – Preview, Analysis, and Capture.
Having covered the Preview and Capture use cases in the earlier tutorial, our main focus today is Analysis.
Hence, we'll use OpenCV to analyze frames and process them in real time.
What is OpenCV?
OpenCV is a computer vision library that contains more than 2000 image-processing algorithms. It is written in C++.
Thankfully, a lot of OpenCV's high-level functionality is exposed in Java. Besides, we can always use the JNI interface to communicate between Java and C++.
We can download the OpenCV SDK from its official GitHub repository and import it as a module, but it's a pretty big module. Due to size and project space constraints, we'll instead use the Gradle dependency shown below:
```gradle
implementation 'com.quickbirdstudios:opencv:3.4.1'
```
Did you know? Android applications that use OpenCV modules tend to have a large APK size.
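One setup detail worth noting: the MainActivity shown later requests the CAMERA and WRITE_EXTERNAL_STORAGE permissions at runtime, so both must also be declared in the manifest. A minimal sketch of the relevant AndroidManifest.xml entries (the rest of the manifest is assumed):

```xml
<!-- AndroidManifest.xml (sketch): declare the permissions that
     MainActivity requests at runtime via ActivityCompat -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
```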
In the following sections, using CameraX and OpenCV, we'll convert the frames between color spaces to give the camera feed a completely different look.
Android OpenCV Project Structure
(Image: Android CameraX OpenCV project structure)
Android OpenCV Code
The code for the activity_main.xml layout is given below:
```xml
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <TextureView
        android:id="@+id/textureView"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:visibility="visible"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintLeft_toLeftOf="parent"
        app:layout_constraintRight_toRightOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <ImageView
        android:id="@+id/ivBitmap"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_gravity="center"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintLeft_toLeftOf="parent"
        app:layout_constraintRight_toRightOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <com.google.android.material.floatingactionbutton.FloatingActionButton
        android:id="@+id/btnCapture"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_marginBottom="24dp"
        android:src="@drawable/ic_camera"
        app:backgroundTint="@android:color/black"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent" />

    <LinearLayout
        android:id="@+id/llBottom"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_marginBottom="24dp"
        android:gravity="center"
        android:orientation="horizontal"
        android:visibility="gone"
        android:weightSum="2"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent">

        <com.google.android.material.floatingactionbutton.FloatingActionButton
            android:id="@+id/btnReject"
            android:layout_width="0dp"
            android:layout_height="wrap_content"
            android:layout_weight="0.5"
            android:src="@drawable/ic_close"
            app:backgroundTint="@color/rejectedRed" />

        <com.google.android.material.floatingactionbutton.FloatingActionButton
            android:id="@+id/btnAccept"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_marginStart="20dp"
            android:layout_weight="0.5"
            android:src="@drawable/ic_check"
            app:backgroundTint="@color/acceptedGreen" />
    </LinearLayout>
</androidx.constraintlayout.widget.ConstraintLayout>
```
The code for MainActivity.java is given below:
```java
package com.journaldev.androidcameraxopencv;

import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.appcompat.app.AppCompatActivity;
import androidx.camera.core.CameraX;
import androidx.camera.core.ImageAnalysis;
import androidx.camera.core.ImageAnalysisConfig;
import androidx.camera.core.ImageCapture;
import androidx.camera.core.ImageCaptureConfig;
import androidx.camera.core.ImageProxy;
import androidx.camera.core.Preview;
import androidx.camera.core.PreviewConfig;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

import android.content.pm.PackageManager;
import android.graphics.Bitmap;
import android.graphics.Matrix;
import android.os.Bundle;
import android.os.Environment;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;
import android.util.Rational;
import android.util.Size;
import android.view.Menu;
import android.view.MenuItem;
import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.view.ViewGroup;
import android.widget.ImageView;
import android.widget.LinearLayout;
import android.widget.Toast;

import com.google.android.material.floatingactionbutton.FloatingActionButton;

import org.opencv.android.OpenCVLoader;
import org.opencv.android.Utils;
import org.opencv.core.Mat;
import org.opencv.imgproc.Imgproc;

import java.io.File;

public class MainActivity extends AppCompatActivity implements View.OnClickListener {

    private int REQUEST_CODE_PERMISSIONS = 101;
    private final String[] REQUIRED_PERMISSIONS =
            new String[]{"android.permission.CAMERA", "android.permission.WRITE_EXTERNAL_STORAGE"};

    TextureView textureView;
    ImageView ivBitmap;
    LinearLayout llBottom;

    int currentImageType = Imgproc.COLOR_RGB2GRAY;

    ImageCapture imageCapture;
    ImageAnalysis imageAnalysis;
    Preview preview;

    FloatingActionButton btnCapture, btnOk, btnCancel;

    // Load the OpenCV native library before the activity is used.
    static {
        if (!OpenCVLoader.initDebug())
            Log.d("ERROR", "Unable to load OpenCV");
        else
            Log.d("SUCCESS", "OpenCV loaded");
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        btnCapture = findViewById(R.id.btnCapture);
        btnOk = findViewById(R.id.btnAccept);
        btnCancel = findViewById(R.id.btnReject);

        btnOk.setOnClickListener(this);
        btnCancel.setOnClickListener(this);

        llBottom = findViewById(R.id.llBottom);
        textureView = findViewById(R.id.textureView);
        ivBitmap = findViewById(R.id.ivBitmap);

        if (allPermissionsGranted()) {
            startCamera();
        } else {
            ActivityCompat.requestPermissions(this, REQUIRED_PERMISSIONS, REQUEST_CODE_PERMISSIONS);
        }
    }

    private void startCamera() {
        CameraX.unbindAll();

        preview = setPreview();
        imageCapture = setImageCapture();
        imageAnalysis = setImageAnalysis();

        // Bind the use cases to the activity's lifecycle:
        CameraX.bindToLifecycle(this, preview, imageCapture, imageAnalysis);
    }

    private Preview setPreview() {
        Rational aspectRatio = new Rational(textureView.getWidth(), textureView.getHeight());
        Size screen = new Size(textureView.getWidth(), textureView.getHeight()); // size of the screen

        PreviewConfig pConfig = new PreviewConfig.Builder()
                .setTargetAspectRatio(aspectRatio)
                .setTargetResolution(screen)
                .build();

        Preview preview = new Preview(pConfig);
        preview.setOnPreviewOutputUpdateListener(
                new Preview.OnPreviewOutputUpdateListener() {
                    @Override
                    public void onUpdated(Preview.PreviewOutput output) {
                        ViewGroup parent = (ViewGroup) textureView.getParent();
                        parent.removeView(textureView);
                        parent.addView(textureView, 0);

                        textureView.setSurfaceTexture(output.getSurfaceTexture());
                        updateTransform();
                    }
                });

        return preview;
    }

    private ImageCapture setImageCapture() {
        ImageCaptureConfig imageCaptureConfig = new ImageCaptureConfig.Builder()
                .setCaptureMode(ImageCapture.CaptureMode.MIN_LATENCY)
                .setTargetRotation(getWindowManager().getDefaultDisplay().getRotation())
                .build();

        final ImageCapture imgCapture = new ImageCapture(imageCaptureConfig);

        btnCapture.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                imgCapture.takePicture(new ImageCapture.OnImageCapturedListener() {
                    @Override
                    public void onCaptureSuccess(ImageProxy image, int rotationDegrees) {
                        Bitmap bitmap = textureView.getBitmap();
                        showAcceptedRejectedButton(true);
                        ivBitmap.setImageBitmap(bitmap);
                    }

                    @Override
                    public void onError(ImageCapture.UseCaseError useCaseError, String message, @Nullable Throwable cause) {
                        super.onError(useCaseError, message, cause);
                    }
                });

                /*File file = new File(
                        Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES),
                        "" + System.currentTimeMillis() + "_JDCameraX.jpg");
                imgCapture.takePicture(file, new ImageCapture.OnImageSavedListener() {
                    @Override
                    public void onImageSaved(@NonNull File file) {
                        Bitmap bitmap = textureView.getBitmap();
                        showAcceptedRejectedButton(true);
                        ivBitmap.setImageBitmap(bitmap);
                    }

                    @Override
                    public void onError(@NonNull ImageCapture.UseCaseError useCaseError, @NonNull String message, @Nullable Throwable cause) {
                    }
                });*/
            }
        });

        return imgCapture;
    }

    private ImageAnalysis setImageAnalysis() {
        // Set up an image analysis pipeline that runs the OpenCV color
        // conversion on every frame, on a dedicated background thread.
        HandlerThread analyzerThread = new HandlerThread("OpenCVAnalysis");
        analyzerThread.start();

        ImageAnalysisConfig imageAnalysisConfig = new ImageAnalysisConfig.Builder()
                .setImageReaderMode(ImageAnalysis.ImageReaderMode.ACQUIRE_LATEST_IMAGE)
                .setCallbackHandler(new Handler(analyzerThread.getLooper()))
                .setImageQueueDepth(1)
                .build();

        ImageAnalysis imageAnalysis = new ImageAnalysis(imageAnalysisConfig);
        imageAnalysis.setAnalyzer(
                new ImageAnalysis.Analyzer() {
                    @Override
                    public void analyze(ImageProxy image, int rotationDegrees) {
                        // Analyzing the live camera feed begins here.
                        final Bitmap bitmap = textureView.getBitmap();
                        if (bitmap == null)
                            return;

                        Mat mat = new Mat();
                        Utils.bitmapToMat(bitmap, mat);

                        Imgproc.cvtColor(mat, mat, currentImageType);
                        Utils.matToBitmap(mat, bitmap);

                        runOnUiThread(new Runnable() {
                            @Override
                            public void run() {
                                ivBitmap.setImageBitmap(bitmap);
                            }
                        });
                    }
                });

        return imageAnalysis;
    }

    private void showAcceptedRejectedButton(boolean acceptedRejected) {
        if (acceptedRejected) {
            CameraX.unbind(preview, imageAnalysis);
            llBottom.setVisibility(View.VISIBLE);
            btnCapture.hide();
            textureView.setVisibility(View.GONE);
        } else {
            btnCapture.show();
            llBottom.setVisibility(View.GONE);
            textureView.setVisibility(View.VISIBLE);
            textureView.post(new Runnable() {
                @Override
                public void run() {
                    startCamera();
                }
            });
        }
    }

    private void updateTransform() {
        Matrix mx = new Matrix();
        float w = textureView.getMeasuredWidth();
        float h = textureView.getMeasuredHeight();

        float cX = w / 2f;
        float cY = h / 2f;

        int rotationDgr;
        int rotation = (int) textureView.getRotation();

        switch (rotation) {
            case Surface.ROTATION_0:
                rotationDgr = 0;
                break;
            case Surface.ROTATION_90:
                rotationDgr = 90;
                break;
            case Surface.ROTATION_180:
                rotationDgr = 180;
                break;
            case Surface.ROTATION_270:
                rotationDgr = 270;
                break;
            default:
                return;
        }

        mx.postRotate((float) rotationDgr, cX, cY);
        textureView.setTransform(mx);
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        if (requestCode == REQUEST_CODE_PERMISSIONS) {
            if (allPermissionsGranted()) {
                startCamera();
            } else {
                Toast.makeText(this, "Permissions not granted by the user.", Toast.LENGTH_SHORT).show();
                finish();
            }
        }
    }

    private boolean allPermissionsGranted() {
        for (String permission : REQUIRED_PERMISSIONS) {
            if (ContextCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED) {
                return false;
            }
        }
        return true;
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.menu_main, menu);
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        switch (item.getItemId()) {
            case R.id.black_white:
                currentImageType = Imgproc.COLOR_RGB2GRAY;
                startCamera();
                return true;
            case R.id.hsv:
                currentImageType = Imgproc.COLOR_RGB2HSV;
                startCamera();
                return true;
            case R.id.lab:
                currentImageType = Imgproc.COLOR_RGB2Lab;
                startCamera();
                return true;
        }
        return super.onOptionsItemSelected(item);
    }

    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.btnReject:
                showAcceptedRejectedButton(false);
                break;
            case R.id.btnAccept:
                File file = new File(
                        Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES),
                        "" + System.currentTimeMillis() + "_JDCameraX.jpg");
                imageCapture.takePicture(file, new ImageCapture.OnImageSavedListener() {
                    @Override
                    public void onImageSaved(@NonNull File file) {
                        showAcceptedRejectedButton(false);
                        Toast.makeText(getApplicationContext(),
                                "Image saved successfully in Pictures Folder", Toast.LENGTH_LONG).show();
                    }

                    @Override
                    public void onError(@NonNull ImageCapture.UseCaseError useCaseError, @NonNull String message, @Nullable Throwable cause) {
                    }
                });
                break;
        }
    }
}
```
In the above code, inside the image analysis use case, we retrieve a Bitmap from the TextureView.
Utils.bitmapToMat converts the Bitmap into a Mat object; this method is part of OpenCV's Android utilities.
The Mat class essentially holds the image: it consists of a matrix header and a pointer to the matrix that contains the pixel values.
In our image analysis, we convert the Mat from one color space to another using Imgproc.cvtColor.
Having converted the Mat to a different color space, we then convert it back to a Bitmap and show it on the screen in an ImageView.
By default, the frame is in the RGB color space. Using the menu options, we can convert the image to the GRAY, LAB, or HSV color spaces.
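The menu resource itself is not shown above; based on the IDs referenced in onOptionsItemSelected (R.id.black_white, R.id.hsv, R.id.lab) and the R.menu.menu_main inflated in onCreateOptionsMenu, a plausible res/menu/menu_main.xml would look like the sketch below (the item titles are assumptions):

```xml
<!-- res/menu/menu_main.xml (sketch): item IDs must match those
     handled in MainActivity.onOptionsItemSelected() -->
<menu xmlns:android="http://schemas.android.com/apk/res/android">
    <item android:id="@+id/black_white" android:title="Grayscale" />
    <item android:id="@+id/hsv" android:title="HSV" />
    <item android:id="@+id/lab" android:title="LAB" />
</menu>
```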
Let’s look at the output of the application in action.
(Image: Android CameraX OpenCV output)
So we were able to analyze and view the processed frames in real time, and optionally save the captured image to the device's Pictures directory.
That brings an end to this tutorial. You can download the project from the link below or view the full source code in our GitHub repository.