I am trying to calculate the look-at matrix myself instead of using gluLookAt(). My problem is that my matrix is not working: using the same parameters with gluLookAt works fine.
My way of creating a look-at matrix:
Vector3 Eye, At, Up; // these should be parameters =)

Vector3 zaxis = At - Eye;
zaxis.Normalize();
Vector3 xaxis = Vector3::Cross(Up, zaxis);
xaxis.Normalize();
Vector3 yaxis = Vector3::Cross(zaxis, xaxis);
yaxis.Normalize();

float r[16] = {
    xaxis.x, yaxis.x, zaxis.x, 0,
    xaxis.y, yaxis.y, zaxis.y, 0,
    xaxis.z, yaxis.z, zaxis.z, 0,
    0,       0,       0,       1,
};
Matrix Rotation;
memcpy(Rotation.values, r, sizeof(r));

float t[16] = {
    1, 0, 0, 0,
    0, 1, 0, 0,
    0, 0, 1, 0,
    -Eye.x, -Eye.y, -Eye.z, 1,
};
Matrix Translation;
memcpy(Translation.values, t, sizeof(t));

View = Rotation * Translation; // I tried reversing this as well (Translation * Rotation)
Now when I try to use this matrix by calling glMultMatrixf, nothing is displayed in my engine, whereas gluLookAt with the same eye, at, and up values works fine, as I said earlier.
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glMultMatrixf(View.values);
The problem should be somewhere in the code shown here. I know the problem is not in my Vector3/Matrix classes, because they work fine when creating the projection matrix.