This is how I did the same thing (my application was also in landscape). First, I get the orientation values (similar to how you do it):
// Fields and sensor callback inside the Activity (which implements SensorEventListener):

final float pi = (float) Math.PI;
final float rad2deg = 180 / pi;

public static float x; // pitch
public static float y; // azimuth
public static float z; // roll

float[] gravity = new float[3];
float[] geomag = new float[3];
float[] inOrientMatrix = new float[16];
float[] outOrientMatrix = new float[16];
float[] orientation = new float[3];

public static GLSurfaceView glView;

// (...)

public void onSensorChanged(SensorEvent event) {
    switch (event.sensor.getType()) {
        case Sensor.TYPE_ACCELEROMETER:
            gravity = event.values.clone();
            break;
        case Sensor.TYPE_MAGNETIC_FIELD:
            geomag = event.values.clone();
            break;
    }

    if (gravity != null && geomag != null) {
        if (SensorManager.getRotationMatrix(inOrientMatrix, null, gravity, geomag)) {
            // Remap the coordinate system because the app runs in landscape
            SensorManager.remapCoordinateSystem(inOrientMatrix,
                    SensorManager.AXIS_X, SensorManager.AXIS_Z, outOrientMatrix);
            SensorManager.getOrientation(outOrientMatrix, orientation);

            x = orientation[1] * rad2deg; // pitch
            y = orientation[0] * rad2deg; // azimuth
            z = orientation[2] * rad2deg; // roll

            glView.requestRender();
        }
    }
}
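For completeness, this assumes the accelerometer and magnetometer listeners are registered somewhere, for example in onResume(). A minimal sketch of what that could look like (the exact place and delay constant are my assumptions, not part of the code above):

// Sketch: register both sensors, assuming the Activity implements SensorEventListener.
@Override
protected void onResume() {
    super.onResume();
    SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
    sm.registerListener(this,
            sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
            SensorManager.SENSOR_DELAY_GAME);
    sm.registerListener(this,
            sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
            SensorManager.SENSOR_DELAY_GAME);
}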
Then, in my renderer's onDrawFrame(GL10 gl), I do:
gl.glLoadIdentity();
GLU.gluLookAt(gl, 0.0f, 0.0f, 0.0f,   // eye at the origin
        0.0f, 0.0f, -1.0f,            // looking down the negative z-axis
        0.0f, 1.0f, 0.0f);            // up vector
gl.glRotatef(MainActivity.x, 1.0f, 0.0f, 0.0f);
gl.glRotatef(MainActivity.y, 0.0f, 1.0f, 0.0f);
gl.glRotatef(MainActivity.z, 0.0f, 0.0f, 1.0f);
In other words, I rotate the whole world around me. It would be better to change the direction of the camera with gluLookAt, keeping the eye at (0,0,0) and (0,1,0) as the up vector, but I just could not get my center_view x, y, z right.
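If you do want to try the gluLookAt route, one way to get a center point is the usual spherical-to-Cartesian conversion from azimuth and pitch. This is only a sketch of that idea (it is not what I ended up shipping, and the sign conventions may need tweaking for your setup):

// Sketch: build a gluLookAt center direction from azimuth/pitch (in radians).
// With azimuth = 0 and pitch = 0 this gives (0, 0, -1), i.e. straight ahead.
float azimuthRad = orientation[0];
float pitchRad = orientation[1];

float centerX = (float) (Math.cos(pitchRad) * Math.sin(azimuthRad));
float centerY = (float) -Math.sin(pitchRad);
float centerZ = (float) -(Math.cos(pitchRad) * Math.cos(azimuthRad));

gl.glLoadIdentity();
GLU.gluLookAt(gl, 0.0f, 0.0f, 0.0f,   // eye at the origin
        centerX, centerY, centerZ,    // look toward the computed direction
        0.0f, 1.0f, 0.0f);            // up vector

Note that this ignores roll; to handle roll you would also have to rotate the up vector instead of keeping it fixed at (0,1,0).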
Hope this helps ...