Today I will show how to use OpenCV and OpenGL to make your own camera preview. It can be used later for image processing or, as in my case, in an augmented reality app. First, some questions, assumptions, and background need to be covered.
Why access the camera natively?
Because image processing is done with OpenCV, which is written in C and C++. You don't want to grab your frames in Java, send them to native code through JNI (which is slow), get the result back, and then render the resulting frame on the screen. So what we are going to do is use the camera, OpenCV, and OpenGL all from C++.
I heard it is not supported on all phones. Is that right?
So here is the problem: the Android developers change their native camera module very frequently. The OpenCV team tries to cover all the modules implemented so far, and they have come up with 7 native libraries to cover the majority of phones. So I guess the answer is yes and no.
My code was tested on a Samsung Galaxy Note, where I got a steady 30 FPS (frames per second); because of the camera hardware you can't get more than 30 FPS.
I am using OpenGL ES 1.1 because it's easier. It is maybe not the most efficient choice, since there are no FBOs (frame buffer objects), but I didn't care much because I reached my target FPS and then some.
No camera controls are implemented here, such as taking pictures or adjusting focus; we just grab frames and render them with OpenGL.
This topic doesn't have a lot of resources, and that is what made me create a blog and write this post.
So let's begin :D
I am assuming that you have installed OpenCV and the NDK and know how to use them.
First of all, you need a jni folder, so create one if it is not there. Then add another folder inside it called build. Inside build you need to put the OpenCV include folder and the libs folder. Both come with the OpenCV SDK:
OpenCV_Directory->sdk->native->jni->include
OpenCV_Directory->sdk->native->libs
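After copying them over, the jni folder should look roughly like this (a sketch; the exact set of libnative_camera_r*.so files depends on your OpenCV version):

jni/
    Android.mk
    CameraRenderer.cpp
    build/
        include/            (the OpenCV headers)
        libs/
            armeabi-v7a/    (or whichever ABI you target)
                libopencv_java.so
                libnative_camera_r2.2.0.so
                ...
                libnative_camera_r4.2.0.so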
Now you are ready to go.
I will explain the structure of my code in terms of threads.
So we have the main thread (1) that controls the whole app. Then, after creating a GLSurfaceView and setting its renderer, we get another thread called the GLThread (2), which is responsible for drawing the frames we grab from the camera onto the screen. The last thread is the frame-grabber thread (3) (a really slow thread); all it does is take frames and store them in a buffer so they can be drawn to the screen later, as sketched below.
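Roughly, the three threads interact like this (buffer, FGmutex, and the function names refer to the native code shown later in this post):

main thread (1):          creates the GLSurfaceView and sets its renderer
GLThread (2):             onDrawFrame() -> renderBackground(): lock FGmutex, convert
                          the latest slot of buffer[], unlock, upload it as a
                          texture, and draw a full-screen quad
frame-grabber thread (3): frameRetriever(): read a camera frame, lock FGmutex,
                          copy it into buffer[bufferIndex++ % 30], unlock, repeat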
Here we have our main Activity class, called CameraPreviewer.java:
package com.mesai.nativecamera;
import java.util.List;
import org.opencv.android.CameraBridgeViewBase.ListItemAccessor;
import org.opencv.android.NativeCameraView.OpenCvSizeAccessor;
import org.opencv.core.Size;
import org.opencv.highgui.Highgui;
import org.opencv.highgui.VideoCapture;
import android.app.Activity;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
import android.view.Display;
public class CameraPreviewer extends Activity {
    GLSurfaceView mView;

    @Override
    protected void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        Native.loadlibs();
        // Open the camera briefly through the Java API just to query the
        // supported preview sizes, which the C++ VideoCapture cannot report.
        VideoCapture mCamera = new VideoCapture(Highgui.CV_CAP_ANDROID);
        List<Size> sizes = mCamera.getSupportedPreviewSizes();
        mCamera.release();
        mView = new GLSurfaceView(getApplication()) {
            @Override
            public void onPause() {
                super.onPause();
                Native.releaseCamera();
            }
        };
        Size size = calculateCameraFrameSize(sizes, new OpenCvSizeAccessor());
        mView.setRenderer(new CameraRenderer(this, size));
        setContentView(mView);
    }
    protected Size calculateCameraFrameSize(List supportedSizes,
            ListItemAccessor accessor) {
        int calcWidth = Integer.MAX_VALUE;
        int calcHeight = Integer.MAX_VALUE;
        Display display = getWindowManager().getDefaultDisplay();
        int maxAllowedWidth = 1024;
        int maxAllowedHeight = 1024;
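        // Pick the smallest supported size that fits inside 1024x1024 (so the
        // frame fits in the texture created on the native side), has a width of
        // at least maxAllowedWidth/2, and whose width divides the display width
        // evenly or whose height divides the display height evenly.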
        for (Object size : supportedSizes) {
            int width = accessor.getWidth(size);
            int height = accessor.getHeight(size);
            if (width <= maxAllowedWidth && height <= maxAllowedHeight) {
                if (width <= calcWidth
                        && width >= (maxAllowedWidth / 2)
                        && (display.getWidth() % width == 0 || display.getHeight() % height == 0)) {
                    calcWidth = width;
                    calcHeight = height;
                }
            }
        }
        return new Size(calcWidth, calcHeight);
    }
    @Override
    protected void onPause() {
        super.onPause();
        mView.onPause();
    }

    @Override
    protected void onResume() {
        super.onResume();
        mView.onResume();
    }
}
Here you can see some odd lines in onCreate where a Java VideoCapture is created and immediately released. This is because the method getSupportedPreviewSizes() is not available in the C++ version, and I needed the camera's supported resolutions so I could pick one that suits me. After that I create the GLSurfaceView used for video rendering.
Now for our custom renderer, CameraRenderer.java:
package com.mesai.nativecamera;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import org.opencv.core.Size;
import android.content.Context;
import android.opengl.GLSurfaceView.Renderer;
public class CameraRenderer implements Renderer {
    private Size size;
    private Context context;

    public CameraRenderer(Context c, Size size) {
        super();
        context = c;
        this.size = size;
    }

    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Thread.currentThread().setPriority(Thread.MAX_PRIORITY);
        Native.initCamera((int) size.width, (int) size.height);
    }
    public void onDrawFrame(GL10 gl) {
        // long startTime = System.currentTimeMillis();
        Native.renderBackground();
        // long endTime = System.currentTimeMillis();
        // if (30 - (endTime - startTime) > 0) {
        //     try {
        //         Thread.sleep(30 - (endTime - startTime));
        //     } catch (InterruptedException e) {}
        // }
        // endTime = System.currentTimeMillis();
        // System.out.println(endTime - startTime + " ms");
    }
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Native.surfaceChanged(width, height, context.getResources().getConfiguration().orientation);
    }
}
Uncommenting the timing code in onDrawFrame will make your camera run at a steady 30 FPS and will print the time needed to draw each frame (just for debugging).
The renderer refers to a class called Native in all of its methods, so here it is. Note that loadlibs() loads opencv_java before NativeCamera, since NativeCamera depends on it:
package com.mesai.nativecamera;
public class Native {
    public static void loadlibs() {
        System.loadLibrary("opencv_java");
        System.loadLibrary("NativeCamera");
    }

    public static native void initCamera(int width, int height);
    public static native void releaseCamera();
    public static native void renderBackground();
    public static native void surfaceChanged(int width, int height, int orientation);
}
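Each of these native methods maps to a C++ function whose exported name follows the JNI convention Java_<package with dots replaced by underscores>_<class>_<method>. For example, initCamera resolves to the symbol below, which you will see in CameraRenderer.cpp:

JNIEXPORT void JNICALL Java_com_mesai_nativecamera_Native_initCamera(JNIEnv*, jobject, jint width, jint height);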
OK, now for the native part. Here is the code for CameraRenderer.cpp:
#include <jni.h>
#include <GLES/gl.h>
#include <GLES/glext.h>
#include <android/log.h>
#include <opencv2/highgui/highgui.hpp>
#include <opencv/cv.h>
#include <pthread.h>
#include <string.h> // for memset
#include <time.h>
#include <math.h>
// Utility for logging:
#define LOG_TAG "CAMERA_RENDERER"
#define LOG(...) __android_log_print(ANDROID_LOG_INFO, LOG_TAG, __VA_ARGS__)
GLuint texture;
cv::VideoCapture capture;
cv::Mat buffer[30];
cv::Mat rgbFrame;
cv::Mat inframe;
cv::Mat outframe;
int bufferIndex;
int rgbIndex;
int frameWidth;
int frameHeight;
int screenWidth;
int screenHeight;
int orientation;
pthread_mutex_t FGmutex;
pthread_t frameGrabber;
pthread_attr_t attr;
struct sched_param param;
GLfloat vertices[] = {
    -1.0f, -1.0f, 0.0f, // V1 - bottom left
    -1.0f,  1.0f, 0.0f, // V2 - top left
     1.0f, -1.0f, 0.0f, // V3 - bottom right
     1.0f,  1.0f, 0.0f  // V4 - top right
};
GLfloat textures[8];
extern "C" {
void drawBackground();
void createTexture();
void destroyTexture();
void *frameRetriever(void*);
JNIEXPORT void JNICALL Java_com_mesai_nativecamera_Native_initCamera(JNIEnv*, jobject, jint width, jint height)
{
    LOG("Camera Created");
    capture.open(CV_CAP_ANDROID + 0); // index 0: the back camera
    capture.set(CV_CAP_PROP_FRAME_WIDTH, width);
    capture.set(CV_CAP_PROP_FRAME_HEIGHT, height);
    frameWidth = width;
    frameHeight = height;
    LOG("frameWidth = %d", frameWidth);
    LOG("frameHeight = %d", frameHeight);
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glShadeModel(GL_SMOOTH);
    glClearDepthf(1.0f);
    glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
    // Spawn the detached frame-grabber thread with FIFO scheduling.
    pthread_attr_t attr;
    pthread_attr_init(&attr);
    pthread_attr_setdetachstate(&attr, PTHREAD_CREATE_DETACHED);
    pthread_attr_setschedpolicy(&attr, SCHED_FIFO);
    memset(&param, 0, sizeof(param));
    param.sched_priority = 100;
    pthread_attr_setschedparam(&attr, &param);
    pthread_create(&frameGrabber, &attr, frameRetriever, NULL);
    pthread_attr_destroy(&attr);
}
JNIEXPORT void JNICALL Java_com_mesai_nativecamera_Native_surfaceChanged(JNIEnv*, jobject, jint width, jint height, jint orien)
{
    LOG("Surface Changed");
    glViewport(0, 0, width, height);
    if (orien == 1) { // ORIENTATION_PORTRAIT
        screenWidth = width;
        screenHeight = height;
        orientation = 1;
    } else {
        screenWidth = height;
        screenHeight = width;
        orientation = 2;
    }
    LOG("screenWidth = %d", screenWidth);
    LOG("screenHeight = %d", screenHeight);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    float aspect = (float) screenWidth / screenHeight; // cast to avoid integer division
    float bt = (float) tan((45.0 / 2.0) * M_PI / 180.0); // half of a 45-degree FOV, in radians
    float lr = bt * aspect;
    glFrustumf(-lr * 0.1f, lr * 0.1f, -bt * 0.1f, bt * 0.1f, 0.1f, 100.0f);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glEnable(GL_TEXTURE_2D);
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClearDepthf(1.0f);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);
    createTexture();
}
JNIEXPORT void JNICALL Java_com_mesai_nativecamera_Native_releaseCamera(JNIEnv*, jobject)
{
    LOG("Camera Released");
    capture.release();
    destroyTexture();
}
void createTexture() {
    // One (u,v) pair per vertex, centering the frameWidth x frameHeight image
    // inside the 1024 x 1024 texture allocated below.
    textures[0] = ((1024.0f - frameWidth * 1.0f) / 2.0f) / 1024.0f;
    textures[1] = ((1024.0f - frameHeight * 1.0f) / 2.0f) / 1024.0f + (frameHeight * 1.0f / 1024.0f);
    textures[2] = ((1024.0f - frameWidth * 1.0f) / 2.0f) / 1024.0f + (frameWidth * 1.0f / 1024.0f);
    textures[3] = ((1024.0f - frameHeight * 1.0f) / 2.0f) / 1024.0f + (frameHeight * 1.0f / 1024.0f);
    textures[4] = ((1024.0f - frameWidth * 1.0f) / 2.0f) / 1024.0f;
    textures[5] = ((1024.0f - frameHeight * 1.0f) / 2.0f) / 1024.0f;
    textures[6] = ((1024.0f - frameWidth * 1.0f) / 2.0f) / 1024.0f + (frameWidth * 1.0f / 1024.0f);
    textures[7] = ((1024.0f - frameHeight * 1.0f) / 2.0f) / 1024.0f;
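    // Worked example (an illustration, assuming a 640x480 frame): horizontally
    // the offset is (1024-640)/2/1024 = 0.1875 and the span is 640/1024 = 0.625,
    // so u runs from 0.1875 to 0.8125; vertically the offset is
    // (1024-480)/2/1024 = 0.265625 and the span is 480/1024 = 0.46875, so v runs
    // from 0.265625 to 0.734375. The texture is 1024x1024 because OpenGL ES 1.1
    // requires power-of-two texture dimensions.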
LOG("Texture Created");
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 1024,1024, 0, GL_RGB,
GL_UNSIGNED_SHORT_5_6_5, NULL);
glBindTexture(GL_TEXTURE_2D, 0);
}
void destroyTexture() {
    LOG("Texture destroyed");
    glDeleteTextures(1, &texture);
}
JNIEXPORT void JNICALL Java_com_mesai_nativecamera_Native_renderBackground(JNIEnv*, jobject) {
    drawBackground();
}
void drawBackground() {
    glClear(GL_COLOR_BUFFER_BIT);
    glBindTexture(GL_TEXTURE_2D, texture);
    if (bufferIndex > 0) {
        // Convert the most recently grabbed frame to RGB565 under the mutex.
        pthread_mutex_lock(&FGmutex);
        cv::cvtColor(buffer[(bufferIndex - 1) % 30], outframe, CV_BGR2BGR565);
        pthread_mutex_unlock(&FGmutex);
        cv::flip(outframe, rgbFrame, 1);
        if (texture != 0)
            glTexSubImage2D(GL_TEXTURE_2D, 0, (1024 - frameWidth) / 2, (1024 - frameHeight) / 2,
                            frameWidth, frameHeight, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, rgbFrame.ptr());
    }
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glLoadIdentity();
    if (orientation != 1) {
        glRotatef(90, 0, 0, 1);
    }
    // Set the face rotation
    glFrontFace(GL_CW);
    // Point to our vertex buffer
    glVertexPointer(3, GL_FLOAT, 0, vertices);
    glTexCoordPointer(2, GL_FLOAT, 0, textures);
    // Draw the vertices as a triangle strip
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    // Disable the client state before leaving
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
}
void *frameRetriever(void*) {
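    // Producer loop: runs on the detached frame-grabber thread until the camera
    // is released. Each frame is copied into the 30-slot ring buffer under
    // FGmutex; drawBackground() on the GLThread reads the latest written slot.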
    while (capture.isOpened()) {
        capture.read(inframe);
        if (!inframe.empty()) {
            pthread_mutex_lock(&FGmutex);
            inframe.copyTo(buffer[(bufferIndex++) % 30]);
            pthread_mutex_unlock(&FGmutex);
        }
    }
    LOG("Camera Closed");
    pthread_exit(NULL);
}
}
Here we have a lot of things that need to be explained:
- In initCamera: we initialize the camera by saying which camera we want to access, then we set the resolution we want our frames to have.
- initCamera (cont.): we also spawn a new thread (the frame grabber). For more information on native threads, look up POSIX threads (pthreads); there is a minimal sketch after this list.
- surfaceChanged: it's all OpenGL initialization stuff.
- destroyTexture and releaseCamera are for exiting the app.
- drawBackground: renders the frames.
- frameRetriever: the method run by the frame-grabber thread.
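If you have not used pthreads before, here is a minimal standalone sketch of the same pattern initCamera uses: create a detached thread and let it run a loop on its own. The worker function and the printed messages are placeholders for illustration, not part of the app's code; compile with g++ and -lpthread.

#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

// Placeholder worker standing in for frameRetriever().
static void *worker(void *) {
    for (int i = 0; i < 3; ++i) {
        printf("grabbing frame %d\n", i);
        usleep(33000); // roughly one frame period at 30 FPS
    }
    pthread_exit(NULL);
}

int main() {
    pthread_t thread;
    pthread_attr_t attr;
    pthread_attr_init(&attr);
    // Detached: the thread cleans up after itself, so no pthread_join() is needed.
    pthread_attr_setdetachstate(&attr, PTHREAD_CREATE_DETACHED);
    pthread_create(&thread, &attr, worker, NULL);
    pthread_attr_destroy(&attr); // safe once the thread has been created
    sleep(1); // keep the process alive long enough for the worker to finish
    return 0;
}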
Now you need an Android.mk file to compile the .cpp. Note that the seven libnative_camera_r*.so modules are declared as prebuilts only so that ndk-build packages them into the app; libopencv_java.so picks and loads the matching one at runtime depending on the Android version, which is why only opencv-prebuilt appears in LOCAL_SHARED_LIBRARIES.
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := opencv-prebuilt
LOCAL_SRC_FILES = build/libs/$(TARGET_ARCH_ABI)/libopencv_java.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/build/include
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := camera1-prebuilt
LOCAL_SRC_FILES = build/libs/$(TARGET_ARCH_ABI)/libnative_camera_r4.2.0.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/build/include
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := camera2-prebuilt
LOCAL_SRC_FILES = build/libs/$(TARGET_ARCH_ABI)/libnative_camera_r4.1.1.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/build/include
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := camera3-prebuilt
LOCAL_SRC_FILES = build/libs/$(TARGET_ARCH_ABI)/libnative_camera_r4.0.3.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/build/include
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := camera4-prebuilt
LOCAL_SRC_FILES = build/libs/$(TARGET_ARCH_ABI)/libnative_camera_r4.0.0.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/build/include
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := camera5-prebuilt
LOCAL_SRC_FILES = build/libs/$(TARGET_ARCH_ABI)/libnative_camera_r3.0.1.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/build/include
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := camera6-prebuilt
LOCAL_SRC_FILES = build/libs/$(TARGET_ARCH_ABI)/libnative_camera_r2.3.3.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/build/include
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_MODULE := camera7-prebuilt
LOCAL_SRC_FILES = build/libs/$(TARGET_ARCH_ABI)/libnative_camera_r2.2.0.so
LOCAL_EXPORT_C_INCLUDES := $(LOCAL_PATH)/build/include
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
OPENGLES_LIB := -lGLESv1_CM
OPENGLES_DEF := -DUSE_OPENGL_ES_1_1
LOCAL_MODULE := NativeCamera
LOCAL_SHARED_LIBRARIES := opencv-prebuilt
LOCAL_SRC_FILES := CameraRenderer.cpp
LOCAL_LDLIBS += $(OPENGLES_LIB) -llog -ldl -lEGL
include $(BUILD_SHARED_LIBRARY)
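Depending on your NDK setup, you may also want an Application.mk next to it so the build targets the same ABI as the prebuilt OpenCV libraries. A minimal sketch, assuming you target armeabi-v7a (adjust to the ABIs shipped with your OpenCV SDK):

# jni/Application.mk
APP_ABI := armeabi-v7a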
I hope you have enjoyed my first tutorial post. If you have any questions, just leave a comment and feel free to ask.
UPDATE 1:
Here you can find the source code:
https://github.com/MESAI/NativeCamera
Add the OpenCV Java shared library.
Add the build folder and its contents to the jni folder.
Then run ndk-build.