I am programming an app where I need to place a label at the top of the screen, and whenever the user touches it (or swipes a finger over it), I need the label's text to be read out loud. I tried the following approaches:
First I made the label an accessibility element:
[labelInfo setIsAccessibilityElement:YES];
But this doesn't work.
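For completeness, this is the fuller setup around that call (labelInfo is my UILabel outlet). As I understand the UIAccessibility documentation, VoiceOver is supposed to speak the element's accessibilityLabel, falling back to the label's own text, as soon as the user touches it:

```objc
// A UILabel is already an accessibility element by default, and
// VoiceOver speaks its text when the user touches it; I set both
// properties explicitly anyway while debugging.
labelInfo.isAccessibilityElement = YES;
labelInfo.accessibilityLabel = labelInfo.text;
```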
Then I tried defining several gestures (a single tap with two touches, long presses with one and two fingers, etc.), and in the gesture handler I post UIAccessibilityAnnouncementNotification so that the text can be read out loud. But it doesn't seem to work at all. This is how I did it:
// UILabel has userInteractionEnabled = NO by default, so gesture
// recognizers attached to it never fire without this:
labelInfo.userInteractionEnabled = YES;
// The target must be the object that implements the action
// (the view controller), not the label itself:
UITapGestureRecognizer *TapOnLabel = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(TapOnLabel:)];
[TapOnLabel setNumberOfTapsRequired:2];
[TapOnLabel setNumberOfTouchesRequired:1];
[labelInfo addGestureRecognizer:TapOnLabel];
[TapOnLabel release];
- (void)TapOnLabel:(UITapGestureRecognizer *)gestureRecognizer
{
    // Ask VoiceOver to announce the label's text
    UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification, labelInfo.text);
    NSLog(@"%@", labelInfo.text);
}
Is there something I am missing, or do gestures simply not work in accessibility mode because many of them are predefined? For example, a two-finger double-tap toggles music playback. If that is the case, I need at least a swipe to work. Any thoughts on this?
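From reading the UIAccessibility traits documentation, my current guess is that VoiceOver intercepts touches before they ever reach the view, which would explain why none of my recognizers fire. The trait below is documented to pass touches inside the element's frame directly to the app; this is a sketch of what I plan to try next, not something I have verified (and I believe it also stops VoiceOver from reading the element on touch, so the announcement in the tap handler would have to do that instead):

```objc
// Ask VoiceOver to deliver touches on the label straight to the app,
// so the attached gesture recognizers can fire even with VoiceOver on.
labelInfo.accessibilityTraits |= UIAccessibilityTraitAllowsDirectInteraction;
```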
P.S. I have seen the solution to a similar question here, but it is specific to zoom gestures and not helpful to me. Also, all the gesture handlers work perfectly with accessibility mode off, so I believe they are written correctly.