Using a rotation vector sensor

I would like to know how to use the output from the rotation vector sensor correctly. Currently I have the code below, which calculates yaw and pitch from result[] to determine where the device (lying in landscape mode) points. But I have problems with the results: the yaw calculation is pretty accurate, but the pitch acts weird. Maybe someone can point me in the right direction on how to use the data. Another thing I would like to know is whether the orientation of the device (landscape or portrait) has any effect on the output of this sensor. Thanks in advance.

    private double max = Math.PI / 2 - 0.01;
    private double min = -max;

    private float[] rotationVectorAction(float[] values) {
        float[] result = new float[3];
        float[] vec = values;
        float[] quat = new float[4];
        float[] orientation = new float[3];
        SensorManager.getQuaternionFromVector(quat, vec);
        float[] rotMat = new float[9];
        SensorManager.getRotationMatrixFromVector(rotMat, quat);
        SensorManager.getOrientation(rotMat, orientation);
        result[0] = orientation[0];
        result[1] = orientation[1];
        result[2] = orientation[2];
        return result;
    }

    private void main() {
        float[] result = rotationVectorAction(sensorInput);
        yaw = result[0];
        pitch = result[1];
        pitch = (float) Math.max(min, pitch);
        pitch = (float) Math.min(max, pitch);
        float dx = (float) (Math.sin(yaw) * (-Math.cos(pitch)));
        float dy = (float) Math.sin(pitch);
        float dz = (float) (Math.cos(yaw) * Math.cos(pitch));
    }

And in OpenGL ES 2.0, I use this to move my camera around:

    Matrix.setLookAtM(mVMatrix, 0, 0, 0, 0, dx, dy, dz, 0, 1, 0);
3 answers
Finally, I figured it out myself. The reason the pitch was not working properly is a mismatch between the input and output formats of SensorManager.getQuaternionFromVector(quat, vec): the method's rotation-vector input vec is ordered (x|y|z|w), while its quaternion output quat is ordered (w|x|y|z). In my case, I just had to move the first value of the quat array to the end, and it worked like a charm.
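
In code, the reordering is just a rotation of the array. A minimal sketch (quat here is the output of getQuaternionFromVector from the question's code):

    // quat is ordered (w|x|y|z); getRotationMatrixFromVector() expects
    // a rotation vector ordered (x|y|z|w), so move w to the end.
    // System.arraycopy handles the overlapping copy within one array.
    float w = quat[0];
    System.arraycopy(quat, 1, quat, 0, 3);
    quat[3] = w;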

The accepted answer did not help me. Skip SensorManager.getQuaternionFromVector(quat, vec) altogether; the docs state that SensorManager.getRotationMatrixFromVector() expects a rotation vector as returned by the ROTATION_VECTOR sensor, so you can pass the sensor values in directly.

    private float[] rotationVectorAction(float[] values) {
        float[] result = new float[3];
        float[] orientation = new float[3];
        float[] rotMat = new float[9];
        SensorManager.getRotationMatrixFromVector(rotMat, values);
        SensorManager.getOrientation(rotMat, orientation);
        result[0] = orientation[0]; // Yaw
        result[1] = orientation[1]; // Pitch
        result[2] = orientation[2]; // Roll
        return result;
    }
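
For completeness, a hypothetical onSensorChanged hook that feeds the sensor values into this method (it assumes the class implements SensorEventListener and is registered for Sensor.TYPE_ROTATION_VECTOR):

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            // Clone the values so later sensor events cannot overwrite
            // them while we are still using the result.
            float[] ypr = rotationVectorAction(event.values.clone());
            float yaw = ypr[0];   // radians, around the Z axis
            float pitch = ypr[1]; // radians, around the X axis
            float roll = ypr[2];  // radians, around the Y axis
        }
    }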

I can answer part of this question.

The orientation of the device does not affect the sensor output, since the sensors always refer to the default orientation of the device. However, to draw something on the screen, such as a compass needle pointing north, you need to take the device orientation into account, because the coordinate system of the canvas you draw onto is rotated when the device is not in its default orientation.
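
A sketch of how that remapping might look with SensorManager.remapCoordinateSystem, assuming the device is rotated 90 degrees from its default orientation (the axis pair to pass depends on the device and the direction of rotation):

    float[] rotMat = new float[9];
    float[] remapped = new float[9];
    float[] orientation = new float[3];
    SensorManager.getRotationMatrixFromVector(rotMat, event.values);
    // Remap from the default (portrait) frame to the rotated screen
    // frame before extracting the orientation angles.
    SensorManager.remapCoordinateSystem(rotMat,
            SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, remapped);
    SensorManager.getOrientation(remapped, orientation);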

