3/23/13

Outdoor/Geo Augmented Reality: Camera Orientation from Mobile Sensors (using LibGDX) in Android

Hi again everyone! Today's topic is AR, specifically outdoor AR (based on your geo location).
I am currently working on an AR browser (it has both outdoor and marker-based modes). In the beginning I faced a big challenge: how to show POIs on the screen. My choices were to (1) use the Android SDK with a SurfaceView, doing all my work on a canvas, so everything would be 2D, or (2) use OpenGL and deal with its mess, especially since I am already using the camera rendering module from my last post, which is built with OpenGL. After some research I got a crazy idea: use a game engine or framework that would handle all the OpenGL mess and hide it behind new API calls. Back to research again, I found that LibGDX is a perfect candidate that solves all the issues, with high performance too.

So here is the situation: we want to show a shape positioned at a specific place in the OpenGL coordinate system, in front of the OpenGL camera, and when the mobile orientation changes, the shape should not leave its position. This makes it feel as if the shape is augmented at a specific position in the real world.
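
To make this concrete before diving into the sensors, here is a minimal sketch, assuming the GL 1.x era LibGDX API that the rest of this post uses, of a quad fixed at a world position. Its vertices never change; only the sensor-driven camera will rotate, which is what creates the illusion:

 // A quad fixed in world space at z = -50 (com.badlogic.gdx.graphics.Mesh).
 Mesh quad = new Mesh(true, 4, 4,
         new VertexAttribute(VertexAttributes.Usage.Position, 3, "a_position"));
 quad.setVertices(new float[] {
         -5, -5, -50,    // bottom-left
          5, -5, -50,    // bottom-right
          5,  5, -50,    // top-right
         -5,  5, -50 }); // top-left
 quad.setIndices(new short[] { 0, 1, 2, 3 });

 // Later, in the render loop:
 camera.apply(Gdx.gl10);             // upload the camera matrices (GL10 path)
 quad.render(GL10.GL_TRIANGLE_FAN);  // the quad itself never moves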



To achieve this, we need to make our orientation sensors give us the orientation of the mobile camera: its look-at point and its up vector.

Notes:


  • You should be familiar with LibGDX, because the code uses many of its classes (such as Matrix4 and Vector3), and you need to know the LibGDX life cycle.
  • The orientation of the phone here is landscape (for portrait mode the rotation matrix taken from the sensors needs to be remapped; see the sketch below).
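
As a rough sketch of that remapping: the Android SensorManager documentation's own augmented-reality example remaps the rotation matrix with AXIS_X / AXIS_Z for a portrait device whose camera looks along the device's Z axis, while the landscape code later in this post uses AXIS_Y / AXIS_MINUS_X. The exact axes depend on how your GL camera is set up; rotationMatrix below is assumed to come from SensorManager.getRotationMatrix.

 // Hedged sketch: portrait-mode remap (per the SensorManager docs' AR example).
 // rotationMatrix is the 16-float matrix from SensorManager.getRotationMatrix.
 float[] remapped = new float[16];
 SensorManager.remapCoordinateSystem(rotationMatrix,
         SensorManager.AXIS_X, SensorManager.AXIS_Z, remapped);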


Here is some sample code for getting the orientation (this method is taken from a bigger class, so any variable not defined here is a field of that class):


 public void start(Context context) {
     listener = new SensorEventListener() {
         private float orientation[] = new float[3];  // smoothed magnetometer values
         private float acceleration[] = new float[3]; // smoothed accelerometer values

         public void onAccuracyChanged(Sensor sensor, int accuracy) {}

         public void onSensorChanged(SensorEvent evt) {
             int type = evt.sensor.getType();
             // Smooth the raw sensor values (lowPass is defined below;
             // it copies the array on first use, so evt.values is never kept).
             if (type == Sensor.TYPE_MAGNETIC_FIELD) {
                 orientation = lowPass(evt.values, orientation, 0.1f);
             } else if (type == Sensor.TYPE_ACCELEROMETER) {
                 acceleration = lowPass(evt.values, acceleration, 0.05f);
             }

             if ((type == Sensor.TYPE_MAGNETIC_FIELD) || (type == Sensor.TYPE_ACCELEROMETER)) {
                 // Build the rotation matrix from the gravity and geomagnetic readings.
                 float newMat[] = new float[16];
                 SensorManager.getRotationMatrix(newMat, null, acceleration, orientation);
                 // Remap for a landscape activity (see the notes above).
                 SensorManager.remapCoordinateSystem(newMat,
                         SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X,
                         newMat);
                 // Android returns a row-major matrix; LibGDX's Matrix4 is
                 // column-major, so transpose it.
                 Matrix4 matT = new Matrix4(newMat).tra();
                 // Rotate the default look-at and up vectors by the device rotation.
                 float[] newLookAt = {0, 0, -far, 1}; // far = 1000
                 float[] newUp = {0, 1, 0, 1};
                 Matrix4.mulVec(matT.val, newLookAt);
                 Matrix4.mulVec(matT.val, newUp);
                 matrix = matT.val;
                 lookAt[0] = newLookAt[0] + position[0]; // position can be dropped if the camera stays at the origin
                 lookAt[1] = newLookAt[1] + position[1];
                 lookAt[2] = newLookAt[2] + position[2];
                 up = newUp;
             }
         }
     };
     sensorMan = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
     sensorAcce = sensorMan.getSensorList(Sensor.TYPE_ACCELEROMETER).get(0);
     sensorMagn = sensorMan.getSensorList(Sensor.TYPE_MAGNETIC_FIELD).get(0);
     sensorMan.registerListener(listener, sensorAcce, SensorManager.SENSOR_DELAY_FASTEST);
     sensorMan.registerListener(listener, sensorMagn, SensorManager.SENSOR_DELAY_FASTEST);
 }
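
The original class doesn't show it, but you will also want a counterpart that unregisters the listener when the app pauses; otherwise the SENSOR_DELAY_FASTEST listeners keep draining the battery. A hypothetical stop() could look like this:

 public void stop() {
     // Hypothetical counterpart to start(); call it from your pause() / onPause().
     if (sensorMan != null && listener != null) {
         sensorMan.unregisterListener(listener);
     }
 }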

As you have seen, a low-pass filter is applied to the data coming from the sensors, because the raw data is too noisy and would cause the shape to jitter on the screen.

Here is the filter (a helper method in the same class, matching the three-argument calls above):

 // Exponential low-pass filter: moves output a fraction alpha toward input.
 // A smaller alpha gives smoother but laggier values.
 public float[] lowPass(float[] input, float[] output, float alpha) {
     if (output == null) {
         // First sample: return a copy so we never alias Android's
         // reused SensorEvent.values array.
         return input.clone();
     }
     for (int i = 0; i < input.length; i++) {
         output[i] = output[i] + alpha * (input[i] - output[i]);
     }
     return output;
 }
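
For intuition, a quick hypothetical example of what alpha does:

 float[] smoothed = null;
 // First call just copies the input.
 smoothed = lowPass(new float[] {30f, 0f, -45f}, smoothed, 0.1f); // {30, 0, -45}
 // With alpha = 0.1f the output moves only 10% toward each new sample,
 // so the jump from 30 to 40 is damped.
 smoothed = lowPass(new float[] {40f, 0f, -45f}, smoothed, 0.1f); // x: 30 + 0.1*(40-30) = 31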

OK, now we need to update our camera's lookAt vector and up vector in the LibGDX render loop:


 public void render() {
     upVector = orientation.getUp();       // the up vector computed in start() above
     camera.up.set(upVector[0], upVector[1], upVector[2]);
     lookVector = orientation.getLookAt(); // the lookAt vector computed in start() above
     camera.lookAt(lookVector[0], lookVector[1], lookVector[2]);
     camera.update();                      // recompute the matrices after changing them
     camera.apply(Gdx.gl10);               // upload them to the fixed-function pipeline
 }
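
This post targets the old fixed-function GL10 pipeline. On GL20-era LibGDX, where Camera.apply was removed, the equivalent loop would look roughly like the sketch below; modelBatch and instance are placeholders for whatever renderer and geometry you use:

 public void render() {
     float[] up = orientation.getUp();
     float[] look = orientation.getLookAt();
     camera.up.set(up[0], up[1], up[2]);
     camera.lookAt(look[0], look[1], look[2]);
     camera.update();              // rebuilds camera.combined
     modelBatch.begin(camera);     // any camera-aware renderer works here
     modelBatch.render(instance);  // placeholder ModelInstance
     modelBatch.end();
 }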

In the end we achieved our target: the OpenGL camera changes its orientation when the sensor values change, so we get the effect of a shape fixed in the real world. I hope you found something useful in my post, and if you have any questions, feel free to ask.

Here is a repository for the code above:
https://github.com/MESAI/AugmentedRealityLibGDX/tree/master/Augmented%20Reality%20With%20Libgdx


12 comments:

  1. Are you converting accelerometer data to position in order to track the object?

    Reply: If I understood you right: I used the accelerometer and magnetometer (compass) to get the orientation, not the position. As you can see in the video, I rotate and turn around but never move from my place. You can't get position from the accelerometer because of the large error and the lack of information (like v0), and because you can't distinguish between moving at a constant speed and standing still. Later in this project I combined GPS with the orientation introduced in this post so that you can get an outdoor position. I hope that answers your question.
  2. Yeah, we tried to get the position from accelerometer values and, as you said, the error keeps propagating. We are trying to implement your code but we are stuck with some errors, as we are not familiar with LibGDX. Can you please share the working code to my email id: naveenksid@gmail.com so that we can rectify our mistakes?

  3. Looks like great work. But the core of this whole effort, how to link the camera preview with LibGDX, isn't mentioned anywhere. Or am I missing something?

  4. Amazing work! I am new to Android development and have many difficulties. I would like to study your code; could you help me?

  5. Hi, I have a simple program in ARToolKit which I need to run as a markerless program. Could you please help me with how I can do it? I could load the augmented object without showing the marker, but when the camera moves the object follows the camera and remains in the center of the screen. Thank you in advance. shahinkey@yahoo.com

  6. What LibGDX version were you using? I tried to run your example and there are a lot of errors, probably because of the LibGDX version.

  7. Your work is really nice. But can you explain how to detect a touch gesture on a shape? Please help me. Thanks so much!