I have an MKMapView with possibly hundreds of polygons drawn, using MKPolygon and MKPolygonRenderer as one is supposed to on iOS 7.
What I need is a way of acting on the user touching one of the polygons. They represent, for example, areas on the map with a certain population density. On iOS 6 the MKOverlays were drawn as MKOverlayViews, so touch detection was more straightforward. Now, using renderers, I don't really see how this is supposed to be done.
I'm not sure this will help or is even relevant, but for reference I'll post some code.
This adds all the MKOverlays to the MKMapView using mapData:
- (void)drawPolygons {
    self.polygonsInfo = [NSMutableDictionary dictionary];

    NSArray *polygons = [self.mapData valueForKeyPath:@"polygons"];
    for (NSDictionary *polygonInfo in polygons) {
        NSArray *polygonPoints = [polygonInfo objectForKey:@"boundary"];
        NSUInteger numberOfPoints = [polygonPoints count];

        CLLocationCoordinate2D *coordinates = malloc(numberOfPoints * sizeof(CLLocationCoordinate2D));
        for (NSUInteger i = 0; i < numberOfPoints; i++) {
            NSDictionary *pointInfo = [polygonPoints objectAtIndex:i];
            CLLocationCoordinate2D point;
            point.latitude = [[pointInfo objectForKey:@"lat"] doubleValue];
            point.longitude = [[pointInfo objectForKey:@"long"] doubleValue];
            coordinates[i] = point;
        }

        MKPolygon *polygon = [MKPolygon polygonWithCoordinates:coordinates count:numberOfPoints];
        polygon.title = [polygonInfo objectForKey:@"name"];
        free(coordinates);

        [self.mapView addOverlay:polygon];
        [self.polygonsInfo setObject:polygonInfo forKey:polygon.title]; // Saving this element's information, keyed by title, for later use in the map view delegate method
    }
}
Then there is the delegate method that returns an MKOverlayRenderer for each MKOverlay:
- (MKOverlayRenderer *)mapView:(MKMapView *)mapView rendererForOverlay:(id<MKOverlay>)overlay {
    /* ... */
    MKPolygon *polygon = (MKPolygon *)overlay;
    NSDictionary *polygonInfo = [self.polygonsInfo objectForKey:polygon.title]; // Retrieving element info by element title
    NSDictionary *colorInfo = [polygonInfo objectForKey:@"color"];

    MKPolygonRenderer *polygonRenderer = [[MKPolygonRenderer alloc] initWithPolygon:polygon];
    polygonRenderer.fillColor = [UIColor colorWithRed:[[colorInfo objectForKey:@"red"] floatValue]
                                                green:[[colorInfo objectForKey:@"green"] floatValue]
                                                 blue:[[colorInfo objectForKey:@"blue"] floatValue]
                                                alpha:[[polygonInfo objectForKey:@"opacity"] floatValue]];
    return polygonRenderer;
    /* ... */
}
You're not going to be able to determine this using the APIs that Apple provides. The best you could do with MapKit would be to maintain a separate database of all of your polygon coordinates, as well as the order in which the rendered versions are stacked. Then, when the user touches a point, you could run a spatial query on that secondary data to find the polygon(s) under the touch and use the stacking order to determine which one they touched.
An easier way to do this, if the polygons are relatively static, would be to create a map overlay in TileMill with its own interactivity data. Here is an example map that contains interactivity data for countries:
https://a.tiles.mapbox.com/v3/examples.map-zmy97flj/page.html
Notice how some name & image data is retrieved on mouseover in the web version. Using the MapBox iOS SDK, which is an open source MapKit clone, you can read that same data out on arbitrary gestures. An example app showing this is here:
https://github.com/mapbox/mapbox-ios-example
That solution might work for your problem and is pretty lightweight as compared to a secondary database and just-in-time calculation of the area touched.
I've done it.
Thanks to incanus and Anna!
Basically, I add a UITapGestureRecognizer to the map view, convert the tapped point to map coordinates, go through my overlays, and check each one with CGPathContainsPoint.
Adding the tap gesture recognizer: I used the trick of adding a second, double-tap recognizer, so that the single-tap gesture isn't fired when double-tapping to zoom the map. If anyone knows a better way, I'd be glad to hear it!
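Roughly, the setup looks like this (a Swift sketch, not my exact code; the outlet and selector names are just illustrative):

import MapKit
import UIKit

class MapViewController: UIViewController {
    @IBOutlet weak var mapView: MKMapView!

    override func viewDidLoad() {
        super.viewDidLoad()

        let singleTap = UITapGestureRecognizer(target: self, action: #selector(handleMapTap(_:)))
        singleTap.numberOfTapsRequired = 1

        // Dummy double-tap recognizer: making the single tap wait for it to fail
        // keeps the handler from firing on the map's double-tap-to-zoom gesture.
        let doubleTap = UITapGestureRecognizer(target: self, action: nil)
        doubleTap.numberOfTapsRequired = 2
        singleTap.require(toFail: doubleTap)

        mapView.addGestureRecognizer(doubleTap)
        mapView.addGestureRecognizer(singleTap)
    }

    @objc func handleMapTap(_ gesture: UITapGestureRecognizer) {
        // Sketched in full below.
    }
}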
Then, in the tap handler, I test the tapped point against each polygon. I could ask for the MKPolygonRenderer, which already has a path property, and use that, but for some reason the path is always nil. I did read that calling invalidatePath on the renderer does fill the path property, but something still seems wrong, because the point is then never found inside any of the polygons. That is why I rebuild the path from the polygon's points; this way I don't even need the renderer and can work with the MKPolygon object alone.
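Roughly, the handler looks like this (a Swift sketch of that approach, not my exact code):

// Inside the view controller sketched above.
@objc func handleMapTap(_ gesture: UITapGestureRecognizer) {
    // Convert the touch to a coordinate, then to an MKMapPoint.
    let tapPoint = gesture.location(in: mapView)
    let tapCoordinate = mapView.convert(tapPoint, toCoordinateFrom: mapView)
    let mapPoint = MKMapPoint(tapCoordinate)   // MKMapPointForCoordinate(_:) on older SDKs

    for overlay in mapView.overlays {
        guard let polygon = overlay as? MKPolygon else { continue }

        // Rebuild the polygon's path in map-point space from its own points,
        // since the renderer's path property wasn't usable here.
        let path = CGMutablePath()
        let points = polygon.points()
        for i in 0..<polygon.pointCount {
            let point = points[i]
            if i == 0 {
                path.move(to: CGPoint(x: point.x, y: point.y))
            } else {
                path.addLine(to: CGPoint(x: point.x, y: point.y))
            }
        }
        path.closeSubpath()

        if path.contains(CGPoint(x: mapPoint.x, y: mapPoint.y)) {
            // Tapped inside this polygon; look up its info (e.g. by title) here.
            print("Tapped polygon: \(polygon.title ?? "untitled")")
        }
    }
}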
I'm considering using both an overlay and a pin annotation: I get the touch from the pin associated with the overlay.
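Something along these lines, for example (a sketch; one hypothetical pin per polygon, matched back by title):

import MapKit

// One pin per polygon; tapping the pin stands in for tapping the overlay itself.
func addPin(for polygon: MKPolygon, on mapView: MKMapView) {
    let pin = MKPointAnnotation()
    pin.coordinate = polygon.coordinate   // MKOverlay's center coordinate
    pin.title = polygon.title
    mapView.addAnnotation(pin)
}

// In the MKMapViewDelegate: selecting the pin tells us which polygon was meant.
func mapView(_ mapView: MKMapView, didSelect view: MKAnnotationView) {
    if let title = view.annotation?.title ?? nil {
        print("Selected polygon: \(title)")
    }
}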
FOR SWIFT 2.1: Find a point/coordinate in a polygon
Here is the logic, without tap gestures, to find an annotation inside a polygon.
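One way to do that check (shown in current Swift syntax rather than Swift 2.1, and not necessarily the exact code from this answer; the function names are illustrative):

import MapKit

// True if the coordinate lies inside the polygon, using a throwaway renderer
// to convert from map coordinates into the renderer's path space.
func polygon(_ polygon: MKPolygon, contains coordinate: CLLocationCoordinate2D) -> Bool {
    let renderer = MKPolygonRenderer(polygon: polygon)
    let mapPoint = MKMapPoint(coordinate)        // MKMapPointForCoordinate(_:) on older SDKs
    let rendererPoint = renderer.point(for: mapPoint)
    return renderer.path?.contains(rendererPoint) ?? false
}

// Usage: which polygon overlay (if any) contains a given annotation?
func firstPolygon(containing annotation: MKAnnotation, on mapView: MKMapView) -> MKPolygon? {
    return mapView.overlays
        .compactMap({ $0 as? MKPolygon })
        .first(where: { polygon($0, contains: annotation.coordinate) })
}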
UPDATED (for Swift 3 & 4): I'm not sure why people are adding a UIGestureRecognizer to the mapView when it already has a number of gesture recognizers running. I found that those approaches inhibit normal functionality of the mapView, in particular tapping on an annotation. Instead, I'd recommend subclassing MKMapView and overriding the touchesEnded method. We can then use the checks others have suggested in this thread and use a delegate method to tell the view controller to do whatever it needs to do. The touches parameter is a set of UITouch objects that we can use:
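A sketch of that idea (the class and protocol names are illustrative; mapViewTouchDelegate matches the property mentioned below):

import MapKit
import UIKit

protocol MapViewTouchDelegate: AnyObject {
    func mapView(_ mapView: MKMapView, didTouchAt coordinate: CLLocationCoordinate2D)
}

class TouchReportingMapView: MKMapView {
    weak var mapViewTouchDelegate: MapViewTouchDelegate?

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesEnded(touches, with: event)

        // Take the first touch, convert its location to a map coordinate,
        // and hand it to the delegate.
        guard let touch = touches.first else { return }
        let point = touch.location(in: self)
        let coordinate = convert(point, toCoordinateFrom: self)
        mapViewTouchDelegate?.mapView(self, didTouchAt: coordinate)
    }
}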
Don't forget to set the ViewController as the mapViewTouchDelegate. I also found it handy to make an extension for MKPolygon:
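Something like this, for example (a sketch that wraps the same renderer-based containment check shown earlier):

extension MKPolygon {
    // True if the coordinate lies inside this polygon's path.
    func contains(_ coordinate: CLLocationCoordinate2D) -> Bool {
        let renderer = MKPolygonRenderer(polygon: self)
        let rendererPoint = renderer.point(for: MKMapPoint(coordinate))
        return renderer.path?.contains(rendererPoint) ?? false
    }
}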
Then the function can be a little cleaner, and the extension may be helpful somewhere else. Plus, it's Swiftier!
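With that extension, the delegate callback from the subclass sketch above can look something like this (ViewController is a placeholder name):

extension ViewController: MapViewTouchDelegate {
    func mapView(_ mapView: MKMapView, didTouchAt coordinate: CLLocationCoordinate2D) {
        if let touched = mapView.overlays
                .compactMap({ $0 as? MKPolygon })
                .first(where: { $0.contains(coordinate) }) {
            print("Touched polygon: \(touched.title ?? "untitled")")
        }
    }
}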
Here is my way in Swift