I wrote an application that detects all active windows and puts them into a list.
Is there a way to simulate a mouse click on a spot on the screen, relative to a window's location, without actually moving the cursor?
I don't have access to the handle of the button that is supposed to be clicked, only to the handle of the window.
To answer your specific question: no. Mouse clicks can only be directed where the mouse cursor actually resides at the time of the click. The correct way to simulate mouse input is to use `SendInput()` (or `mouse_event()` on older systems). But those functions inject simulated events into the same input queue that the actual mouse driver posts to, so they have a physical effect on the mouse cursor, i.e. they move it around the screen. As Raymond Chen puts it in "How do I simulate input without SendInput?":

> When something gets added to a queue, it takes time for it to come out the front of the queue.
The only real way to do what you are asking for is to find the `HWND` of the UI control that is located at the desired screen coordinates. Then you can either:

- send `WM_LBUTTONDOWN` and `WM_LBUTTONUP` messages directly to it, or, in the case of a standard Win32 button control, send a single `BM_CLICK` message instead;
- use the `AccessibleObjectFromWindow()` function of the UI Automation API to access the control's `IAccessible` interface, and then call its `accDoDefaultAction()` method, which for a button will click it.

That being said, ...
You can access anything that has an `HWND`. Have a look at `WindowFromPoint()`, for instance. You can use it to find the `HWND` of the button that occupies the desired screen coordinates (with caveats, of course; see Raymond Chen's "WindowFromPoint, ChildWindowFromPoint, RealChildWindowFromPoint, when will it all end?").