In some apps, it makes sense for the app to directly handle keyboard shortcuts which are otherwise bound to system-wide combinations. For example, ⌘-Space (normally Spotlight) or ⌘-Tab (normally the app switcher). This works in various Mac apps, such as VMware Fusion, Apple's own Screen Sharing and Remote Desktop clients (which forward the events to the VM or server, respectively, instead of handling them locally), and also some similar third-party apps in the App Store.
We would like to implement such a mode in the app we're working on, but are having a hard time working out how to do it. I should point out that the app in question is a regular foreground app, is sandboxed, and any solution must comply with App Store rules. The fact that other apps on the store can do it implies that this must be possible.
To be clear, we want to:
- Detect and handle all key presses, including those bound to global shortcuts.
- Prevent global shortcuts from triggering their globally bound effect.
Apple's Event Architecture document suggests that the foreground application should already be receiving these events. (It only talks about earlier levels handling things such as the power and eject buttons, which is fine.) It goes on to suggest, and the key events document also implies, that NSApplication's sendEvent: method is what detects potential shortcuts based on modifier flags, dispatching them to the windows and, if that fails, on to the menu bar. It's not explicitly stated what happens to globally bound shortcuts.
I tried subclassing NSApplication and overriding sendEvent:. Whether I pass all events through to the superclass implementation or, say, filter out modifier key events, when I press ⌘-Space I receive the events for pressing and releasing the command (⌘) key, but not the spacebar. The Spotlight UI always pops up.
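For reference, the experiment was roughly the following sketch (MyApplication is a placeholder name; it must be set as NSPrincipalClass in Info.plist so Cocoa instantiates it):

```objc
#import <Cocoa/Cocoa.h>

@interface MyApplication : NSApplication
@end

@implementation MyApplication

- (void)sendEvent:(NSEvent *)event {
    // ⌘-Space never arrives here as a key-down: only the flagsChanged
    // events for the ⌘ key are delivered, and Spotlight opens anyway.
    if (event.type == NSEventTypeKeyDown || event.type == NSEventTypeFlagsChanged) {
        NSLog(@"event: %@", event);
    }
    [super sendEvent:event];
}

@end
```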
I haven't found much information on subclassing NSApplication and its early event handling, from Apple or otherwise. I can't seem to find out at what level global shortcuts are detected and handled.
Can someone please point me in the right direction?
Possible solutions which don't work:
Suggestions I've seen in other Stack Overflow posts but which don't apply to the other apps I've seen which do this (and which would break App Store rules):
- Accessibility APIs (need special permission)
- Event taps/hooks (need to run as root)
Both of these would be overkill anyway, as they let you intercept all events at all times, not just while your app is the foreground app.
NSEvent's addGlobalMonitorForEventsMatchingMask:handler:, meanwhile, doesn't prevent the global shortcut handler from firing for those events, so I didn't even bother trying it.
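For completeness, that API is observe-only by design: the handler receives the event after the fact and has no way to consume it, along these lines:

```objc
#import <Cocoa/Cocoa.h>

void installMonitor(void) {
    // Global monitors can only watch events delivered to other apps;
    // the handler returns void, so there is no way to swallow the event
    // and stop Spotlight or the app switcher from reacting.
    [NSEvent addGlobalMonitorForEventsMatchingMask:NSEventMaskKeyDown
                                           handler:^(NSEvent *event) {
        NSLog(@"saw keyDown: %hu", event.keyCode);
    }];
}
```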
OK, so the Cocoa event methods and Quartz event taps are out because they either require root or accessibility access, or do not catch events before the Dock does.
Carbon's PushSymbolicHotKeyMode is out because, per the docs, it requires accessibility access.
Carbon's RegisterEventHotKey is probably out because Apple doesn't seem to allow it (see my link in a comment on the question). Even so, I tested it, and you can't use it to catch Command+Tab anyway.
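The test looked roughly like this (the hot-key signature and ID are arbitrary values I made up):

```objc
#import <Carbon/Carbon.h>
#import <stdio.h>

// Attempt to claim ⌘-Tab as a Carbon hot key. The registration call may
// succeed, but the system app switcher keeps handling ⌘-Tab regardless.
static void tryRegisterCommandTab(void) {
    EventHotKeyRef hotKeyRef = NULL;
    EventHotKeyID hotKeyID = { .signature = 'htk1', .id = 1 };  // arbitrary
    OSStatus status = RegisterEventHotKey(kVK_Tab, cmdKey, hotKeyID,
                                          GetApplicationEventTarget(),
                                          0, &hotKeyRef);
    printf("RegisterEventHotKey returned %d\n", (int)status);
}
```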
I made a quick proof-of-concept of how this can work, but YMMV:
- Implement the KeyboardWatcher example class from this answer. You will need to link IOKit.
- Add the Hardware - USB (com.apple.security.device.usb) sandboxing entitlement. This is necessary because KeyboardWatcher uses HID to catch key presses.
- The Handle_DeviceEventCallback will give you the keys that are pressed. You can obviously modify this to your needs.
- Use SetSystemUIMode to block the task switcher and Spotlight. You will need to link Carbon.
SetSystemUIMode(kUIModeContentSuppressed, kUIOptionDisableProcessSwitch);
Note that this will only work while your app is in the foreground (probably what you want). I set this on my view using a tracking rectangle, so it only takes effect when the mouse is over my view (like in Remotix):
- (void)viewDidLoad {
    [super viewDidLoad];
    NSTrackingArea *trackingArea =
        [[NSTrackingArea alloc] initWithRect:[self.view bounds]
                                     options:(NSTrackingMouseEnteredAndExited |
                                              NSTrackingActiveAlways)
                                       owner:self
                                    userInfo:nil];
    [self.view addTrackingArea:trackingArea];
}

- (void)mouseEntered:(NSEvent *)theEvent {
    SetSystemUIMode(kUIModeContentSuppressed, kUIOptionDisableProcessSwitch);
}

- (void)mouseExited:(NSEvent *)theEvent {
    SetSystemUIMode(kUIModeNormal, 0);
}
Remotix seems to link Carbon and IOKit, but I can't see if they have the USB entitlement (I tried the demo, not the App Store version). It's possible they are doing something like this.
The normal way to achieve this is by installing a Quartz event tap. However, to receive events targeting other applications, you need (as you say) to either be root or have accessibility access enabled for your app.
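Such a tap would be created roughly like this; the creation call simply fails in a sandboxed app without the required privileges:

```objc
#import <ApplicationServices/ApplicationServices.h>
#import <stdio.h>

// Tap callback: returning NULL here would swallow the event entirely,
// which is exactly the power that requires root or accessibility access.
static CGEventRef tapCallback(CGEventTapProxy proxy, CGEventType type,
                              CGEventRef event, void *refcon) {
    return event;  // pass everything through unchanged
}

static void installTap(void) {
    CGEventMask mask = CGEventMaskBit(kCGEventKeyDown);
    CFMachPortRef tap = CGEventTapCreate(kCGSessionEventTap,
                                         kCGHeadInsertEventTap,
                                         kCGEventTapOptionDefault,
                                         mask, tapCallback, NULL);
    if (!tap) {
        // This is what happens in a sandboxed app without accessibility access.
        fprintf(stderr, "CGEventTapCreate failed\n");
        return;
    }
    CFRunLoopSourceRef source =
        CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0);
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, kCFRunLoopCommonModes);
    CGEventTapEnable(tap, true);
}
```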
It seems not possible to use an event tap with the current sandboxing rules. This is confirmed in the developer forum. The link is login only, but to quote from the thread:
Is there any chance to handle events coming from the media keys, to prevent launching iTunes? Before the sandbox it was possible by creating a CGEventTap, but now the sandbox denies hid-control.
No, this is not currently possible within App Sandbox.
I'm not sure of another way to do this, and I'd be interested to know which apps in the App Store can.
VMware Fusion is clearly not sandboxed, and Apple's own apps are exempt from the rules. Remember that sandboxing is only enforced on apps added to the store after it was introduced in 2012; apps added before that date do not have sandboxing enforced. See this answer.
I solved this ages ago, but I only just noticed I never posted it here. The answer ended up involving CGSSetGlobalHotKeyOperatingMode(). This is not a public API, but there are a number of Mac App Store apps which use it by obfuscating the function name and looking it up dynamically, and Apple doesn't seem to mind. The API is pretty straightforward to use, and there's plenty of open example source code floating about.
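The dynamic lookup can be sketched as follows. Note the caveats: this is a private API, so the typedefs, signature, and mode constants below come from open-source CGS headers floating around, not from Apple, and are assumptions that could break in any OS release.

```objc
#import <dlfcn.h>
#import <stdbool.h>
#import <ApplicationServices/ApplicationServices.h>

// Assumed private types/constants, taken from unofficial CGS headers.
typedef int CGSConnectionID;
typedef enum {
    kCGSGlobalHotKeyEnable  = 0,
    kCGSGlobalHotKeyDisable = 1
} CGSGlobalHotKeyOperatingMode;
typedef CGSConnectionID (*CGSMainConnectionIDFunc)(void);
typedef CGError (*CGSSetGlobalHotKeyOperatingModeFunc)(CGSConnectionID,
                                                       CGSGlobalHotKeyOperatingMode);

static void setGlobalHotKeysEnabled(bool enabled) {
    // Resolve the symbols at runtime. App Store builds typically assemble
    // the name strings at runtime so they don't appear verbatim in the binary.
    CGSMainConnectionIDFunc mainConnection =
        (CGSMainConnectionIDFunc)dlsym(RTLD_DEFAULT, "CGSMainConnectionID");
    CGSSetGlobalHotKeyOperatingModeFunc setMode =
        (CGSSetGlobalHotKeyOperatingModeFunc)dlsym(RTLD_DEFAULT,
                                                   "CGSSetGlobalHotKeyOperatingMode");
    if (mainConnection && setMode) {
        setMode(mainConnection(),
                enabled ? kCGSGlobalHotKeyEnable : kCGSGlobalHotKeyDisable);
    }
}
```

Remember to re-enable the hot keys (and handle app deactivation) so the user isn't left without ⌘-Tab when your app is in the background.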