MKOverlayView and OpenGL

I currently have a UIView that draws radar data on top of MKMapView using OpenGL. Due to the level of detail of the radar image, OpenGL is required (CoreGraphics is not fast enough).

All of the images I draw are stored as MKMapPoints. I chose them over the standard CLLocationCoordinate2D because their lengths do not vary with latitude. My basic drawing method is:

  • Add the GLView as a subview of the MKMapView and set GLView.frame = MKMapView.frame .
  • Using glOrthof , set the GLView's projection to the currently visible MKMapRect of the map. Here is the code that does this:

    CLLocationCoordinate2D coordinateTopLeft = [mapView convertPoint:CGPointMake(0, 0) toCoordinateFromView:mapView];
    MKMapPoint pointTopLeft = MKMapPointForCoordinate(coordinateTopLeft);

    CLLocationCoordinate2D coordinateBottomRight = [mapView convertPoint:CGPointMake(mapView.frame.size.width, mapView.frame.size.height) toCoordinateFromView:mapView];
    MKMapPoint pointBottomRight = MKMapPointForCoordinate(coordinateBottomRight);

    glLoadIdentity();
    glOrthof(pointTopLeft.x, pointBottomRight.x, pointBottomRight.y, pointTopLeft.y, -1, 1);
  • Set the viewport to the correct size using glViewport(0, 0, backingWidth, backingHeight) , where backingWidth and backingHeight are the size of the mapView in points.

  • Draw with glDrawArrays . Not sure if it matters, but GL_VERTEX_ARRAY and GL_TEXTURE_COORD_ARRAY are enabled during drawing.

Using this method, everything works fine. The drawing is performed as intended. The only problem is that, because the GLView is a subview of the mapView (and not an overlay), the radar image is drawn on top of any other MKAnnotations and MKOverlays . I need this layer to be drawn underneath the other annotations and overlays.

My attempt at this was to make the GLView a subview of a custom MKOverlayView instead of the mapView . What I did was give the MKOverlay a boundingMapRect of MKMapRectWorld , and set the GLView's frame the same way I set the projection (since the frame of an MKOverlayView is specified in MKMapPoints , not CGPoints ). Again, here is the code:

    CLLocationCoordinate2D coordinateTopLeft = [mapView convertPoint:CGPointMake(0, 0) toCoordinateFromView:mapView];
    MKMapPoint pointTopLeft = MKMapPointForCoordinate(coordinateTopLeft);

    CLLocationCoordinate2D coordinateBottomRight = [mapView convertPoint:CGPointMake(mapView.frame.size.width, mapView.frame.size.height) toCoordinateFromView:mapView];
    MKMapPoint pointBottomRight = MKMapPointForCoordinate(coordinateBottomRight);

    glRadarView.frame = CGRectMake(pointTopLeft.x, pointTopLeft.y, pointBottomRight.x - pointTopLeft.x, pointBottomRight.y - pointTopLeft.y);

When I do this, the GLView is positioned correctly on screen (in the same place it was as a subview of the mapView ), but the drawing no longer works correctly. When the image does appear, it is the wrong size and in the wrong place. I checked, and backingWidth and backingHeight still hold the size of the view in points (as they should).

Any idea why this is not working?

2 answers

I haven't done iPhone development long enough to fully understand your code, but I seem to recall that when I messed around with OpenGL on the iPhone some time ago, I found I had to keep my own z-index and simply draw the elements in that order. Each drawing operation was correctly transformed in 3D, but anything drawn later always ended up on top of anything drawn earlier. One of my early test programs drew a grid on a surface and then rotated the whole thing; I expected the grid to disappear when the back of the object faced me, but it stayed visible because it was drawn later, in a separate operation (IIRC).

It is possible that I was doing something wrong that caused this, but my solution was to order my draws by z-index.

Could you just draw your image first, using your first method?


I think you just need to set the viewport before setting up the projection matrix.


Source: https://habr.com/ru/post/1389756/

