Gestures not getting detected in accessibility mode

Posted 2019-05-10 08:10

Question:

I am writing an app where I need to put a label at the top of the screen, and whenever the user touches it (or swipes a finger over it), the label text should be read out loud. I tried the following approaches:

First, I made the label an accessibility element:

[labelInfo setIsAccessibilityElement:YES];

But this doesn't work. I then tried defining several gestures (double tap, single tap, long press with one and two fingers, etc.), and in each gesture handler I call UIAccessibilityPostNotification so that the text is read out loud. But it doesn't seem to work at all. This is how I did it:

// UILabel has user interaction disabled by default; without this the
// recognizer never fires.
labelInfo.userInteractionEnabled = YES;

// The target must be the object that implements TapOnLabel: (here, self),
// not the label itself. Accessibility flags belong on the view, not on
// the gesture recognizer.
UITapGestureRecognizer *TapOnLabel = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(TapOnLabel:)];
[TapOnLabel setNumberOfTapsRequired:2];
[TapOnLabel setNumberOfTouchesRequired:1];
[labelInfo addGestureRecognizer:TapOnLabel];
[TapOnLabel release];

- (void)TapOnLabel:(UITapGestureRecognizer *)gestureRecognizer
{
    UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification, @"Where am I?");
    NSLog(@"%@", labelInfo.text);
}

Is there something I am missing, or do no gestures work in accessibility mode because many are predefined? For example, a two-finger double tap toggles music on/off. If that is the case, I need at least swipe to work. Any thoughts on this?

P.S. I have seen the solution to a similar question here, but it is specific to zoom gestures and not helpful to me. Also, all the gesture recognizers work perfectly without accessibility mode, so they are written correctly.

Answer 1:

The gestures are all intercepted by VoiceOver. There is a gesture passthrough mode: double-tap and hold your finger on the screen for about one second. You will then hear a tone, and interception is disabled until you lift your finger or complete a gesture. This gives you essentially 8 gestures you can then perform: the four swipes and the four drags.
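If the app genuinely needs raw touches on one specific view while VoiceOver is running, another option is the direct-interaction trait, which tells VoiceOver to pass touches on that element straight through to the app. A minimal sketch, using the `labelInfo` label from the question:

```objectivec
// Mark the label as directly interactive so VoiceOver forwards raw
// touches (taps, swipes) on it to the app instead of intercepting them.
labelInfo.isAccessibilityElement = YES;
labelInfo.accessibilityTraits |= UIAccessibilityTraitAllowsDirectInteraction;
```

Note that with this trait set, VoiceOver no longer speaks the element when it is touched, so you would have to post announcements yourself, as the question's handler already does.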

But having said that, why does your application need to behave this way? Why not simply set the accessibility label and/or hint and let VoiceOver's focus and touch-to-explore work as they were designed?
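For the behaviour described in the question, speaking the label's text when the user touches it, the standard approach needs no gesture recognizers at all. A minimal sketch (the hint text here is purely illustrative):

```objectivec
// With these properties set, VoiceOver speaks the label's text as soon
// as the user touches it (touch-to-explore), with no custom gesture
// handling required.
labelInfo.isAccessibilityElement = YES;
labelInfo.accessibilityLabel = labelInfo.text;  // what VoiceOver reads on touch
labelInfo.accessibilityHint = @"Shows your current location.";  // optional extra context
```

UILabel is an accessibility element by default and exposes its text automatically, so in many cases even this is unnecessary; explicit `accessibilityLabel`/`accessibilityHint` values matter mainly when the spoken text should differ from the displayed text.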