I would like to know how to use the output of the "Rotation Vector Sensor" correctly. Currently I have come up with the following and want to calculate the yaw and pitch from result[] to find out where the device (lying in landscape mode) points. But I have problems with the results: the yaw calculation is pretty accurate, but the pitch is acting weird. Maybe someone can point me in the right direction on how to use the data. Another thing I would like to know is whether the orientation of the device (landscape or portrait) has any effect on the output of this sensor. Thanks in advance.
private double max = Math.PI / 2 - 0.01;
private double min = -max;

private float[] rotationVectorAction(float[] values) {
    float[] orientation = new float[3];
    float[] rotMat = new float[9];
    // Note: getRotationMatrixFromVector() takes the raw rotation vector from
    // the sensor event, not the quaternion returned by getQuaternionFromVector().
    SensorManager.getRotationMatrixFromVector(rotMat, values);
    SensorManager.getOrientation(rotMat, orientation);
    return orientation;
}

private void main() {
    float[] result = rotationVectorAction(sensorInput);
    yaw = result[0];
    pitch = result[1];
    // Clamp pitch just short of +/- 90 degrees.
    pitch = (float) Math.max(min, pitch);
    pitch = (float) Math.min(max, pitch);
    float dx = (float) (Math.sin(yaw) * -Math.cos(pitch));
    float dy = (float) Math.sin(pitch);
    float dz = (float) (Math.cos(yaw) * Math.cos(pitch));
}
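One likely source of the odd pitch values is the quaternion step: `SensorManager.getRotationMatrixFromVector()` expects the raw rotation vector laid out as (x, y, z[, w]) straight from the sensor event, whereas `getQuaternionFromVector()` returns (w, x, y, z), so feeding the quaternion in shifts every component. The matrix and angle math itself can be checked outside Android. Below is a minimal pure-Java sketch (class and method names are mine; the extraction formulas mirror `SensorManager.getOrientation()`, and a Hamilton (w, x, y, z) quaternion convention with Android's row-major 3x3 matrix layout is assumed):

```java
// QuaternionOrientation.java -- pure-Java sketch, no Android dependency.
public class QuaternionOrientation {

    // Build a row-major 3x3 rotation matrix from a unit quaternion (w, x, y, z).
    public static double[] rotationMatrixFromQuaternion(double w, double x,
                                                        double y, double z) {
        return new double[] {
            1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y),
            2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x),
            2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)
        };
    }

    // Same formulas as SensorManager.getOrientation(): azimuth, pitch, roll
    // in radians, extracted from the row-major matrix r.
    public static double[] orientationFromMatrix(double[] r) {
        return new double[] {
            Math.atan2(r[1], r[4]),   // azimuth (yaw)
            Math.asin(-r[7]),         // pitch
            Math.atan2(-r[6], r[8])   // roll
        };
    }

    public static void main(String[] args) {
        // A 30-degree rotation about the device x-axis should appear as pitch.
        double half = Math.PI / 12;  // half of 30 degrees
        double[] r = rotationMatrixFromQuaternion(Math.cos(half), Math.sin(half), 0, 0);
        double[] o = orientationFromMatrix(r);
        System.out.printf("azimuth=%.4f pitch=%.4f roll=%.4f%n", o[0], o[1], o[2]);
    }
}
```

If you deliberately reorder the quaternion components before building the matrix, the extracted pitch changes, which is a quick way to confirm whether the ordering mismatch is what you are seeing on the device.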
And in OpenGL ES 2.0, I use this call to move my camera around:
Matrix.setLookAtM(mVMatrix, 0, 0, 0, 0, dx, dy, dz, 0, 1, 0);
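Since the eye in this `setLookAtM` call sits at the origin, the center point (dx, dy, dz) is simply the unit view direction derived from yaw and pitch. That mapping can be verified in plain Java (class and method names are mine; the formulas are the ones from the question):

```java
// LookDirection.java -- maps (yaw, pitch) to the unit view direction that is
// passed to Matrix.setLookAtM as the "center" point (eye is at the origin,
// so center = eye + direction).
public class LookDirection {

    public static float[] directionFromYawPitch(float yaw, float pitch) {
        float dx = (float) (Math.sin(yaw) * -Math.cos(pitch));
        float dy = (float) Math.sin(pitch);
        float dz = (float) (Math.cos(yaw) * Math.cos(pitch));
        return new float[] { dx, dy, dz };
    }

    public static void main(String[] args) {
        // yaw = 0, pitch = 0 should look straight down the +z axis.
        float[] d = directionFromYawPitch(0f, 0f);
        System.out.printf("dx=%.3f dy=%.3f dz=%.3f%n", d[0], d[1], d[2]);
    }
}
```

One design note: clamping pitch just short of ±π/2 (the min/max in the code above) matters here, because at exactly ±π/2 the view direction becomes parallel to the up vector (0, 1, 0) and the look-at basis degenerates.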
dima