Android: given my current location and the latitude / longitude of the places around me, how do I decide which places are visible in the camera?

I am creating an AR application for Android that will display the names of places / buildings / etc. in the live camera view when I point the camera at them. I get my current location in lat and long, and I can also get a list of places (with their lat / long) within a certain radius of my current location.

However, the most confusing part of the implementation is showing only those places that are currently visible in the camera (and not, for example, the places behind me). One idea was to calculate the azimuth of my current heading, then calculate the azimuth of each place I get within the given radius, get the horizontal view angle of the camera using getHorizontalViewAngle(), and with all these parameters determine which of the places' azimuths fall into the interval [heading - angle/2, heading + angle/2].
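For illustration, here is a minimal sketch of that interval check. The class and method names are my own, and it assumes you already have the device heading from the orientation sensors and the view angle from Camera.Parameters.getHorizontalViewAngle():

```java
import android.location.Location;

public class AzimuthFilter {

    // Returns true if the bearing from the user to the place falls inside
    // the camera's horizontal field of view centred on the device heading.
    // All angles are in degrees; headingDeg is the device azimuth from the
    // orientation sensors, viewAngleDeg comes from getHorizontalViewAngle().
    public static boolean isInView(Location me, Location place,
                                   float headingDeg, float viewAngleDeg) {
        float bearing = me.bearingTo(place);          // degrees east of true north
        float diff = normalize(bearing - headingDeg); // signed angular distance
        return Math.abs(diff) <= viewAngleDeg / 2f;
    }

    // Normalizes an angle to [-180, 180) so the wrap-around at north
    // (e.g. 359 degrees vs 1 degree) is handled correctly.
    private static float normalize(float deg) {
        deg = deg % 360f;
        if (deg >= 180f) deg -= 360f;
        if (deg < -180f) deg += 360f;
        return deg;
    }
}
```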

However, I think this is not a very efficient approach. Can someone suggest a solution, or maybe someone has had a similar problem and found a good way to handle it? If my problem is hard to understand, let me know and I will try to explain it in more detail.

+4
1 answer

You are doing the right thing, but in our project we found it better (in terms of performance) to use the rotation matrix instead of the azimuth. You can take a look at the source code of the mixare augmented reality engine; it is on GitHub: https://github.com/mixare/mixare

The main logic is in the MixView class. The main idea is to convert everything into vectors and project them onto a "virtual" sphere that surrounds the phone.
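A rough sketch of that idea (this is not mixare's actual code, just the core transform): convert each place's geographic offset into a local East-North-Up vector, rotate it into device coordinates with the rotation matrix from SensorManager.getRotationMatrix(), and check which side of the camera it lands on. The helper names below are illustrative, and clipping against the field of view is omitted:

```java
import android.hardware.SensorManager;

public class RotationFilter {

    // Approximate metre offsets from (lat1, lon1) to (lat2, lon2)
    // using an equirectangular approximation (fine at small radii).
    public static double[] metreOffsets(double lat1, double lon1,
                                        double lat2, double lon2) {
        final double EARTH_RADIUS = 6371000.0;
        double dNorth = Math.toRadians(lat2 - lat1) * EARTH_RADIUS;
        double dEast  = Math.toRadians(lon2 - lon1) * EARTH_RADIUS
                        * Math.cos(Math.toRadians(lat1));
        return new double[] { dEast, dNorth };
    }

    // World-space (East, North, Up) unit vector from the user to the place.
    public static float[] toEnuVector(double dEast, double dNorth) {
        double len = Math.sqrt(dEast * dEast + dNorth * dNorth);
        return new float[] { (float) (dEast / len), (float) (dNorth / len), 0f };
    }

    // R (row-major 3x3 from SensorManager.getRotationMatrix) maps device
    // coordinates to world coordinates, so apply its transpose to go the
    // other way: world -> device.
    public static float[] worldToDevice(float[] R, float[] w) {
        return new float[] {
            R[0] * w[0] + R[3] * w[1] + R[6] * w[2],
            R[1] * w[0] + R[4] * w[1] + R[7] * w[2],
            R[2] * w[0] + R[5] * w[1] + R[8] * w[2],
        };
    }

    // The back camera looks along the device's -Z axis, so a place is
    // (roughly) in front of the camera when its device-space Z is negative.
    public static boolean isInFrontOfCamera(float[] deviceVec) {
        return deviceVec[2] < 0f;
    }
}
```

Working with vectors like this avoids the angle wrap-around headaches entirely, and the same device-space vector also gives you the screen position for drawing the label.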

HTH, Daniele

+3

Source: https://habr.com/ru/post/1382584/
