I am working on a desktop project using Visual C++/MFC. There are lots of buttons, but the problem is that the application has to run on a touch screen monitor where no mouse or keyboard is available.
So, will ON_BN_CLICKED work for touch events on a touch screen monitor, or do I have to handle them some other way?
By "Touch" event if you mean screen tap then Yes, they CAN be treated as same.
Windows 7 provides built-in support so that applications with no explicit touch or ink support can still receive input through the on-screen keyboard and the writing pad.
By default, Windows treats touch in much the same way as mouse input, with screen taps equating to mouse clicks. So ON_BN_CLICKED handlers will fire for screen taps.
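For illustration, here is a minimal sketch of an ordinary button handler; nothing touch-specific is needed for it to respond to taps. The class name, dialog template ID, and control ID (CMyDialog, IDD_MY_DIALOG, IDC_MY_BUTTON) are placeholders, not from your project:

```cpp
#include <afxwin.h>
#include <afxdialogex.h>
#include "resource.h"   // assumed to define IDD_MY_DIALOG and IDC_MY_BUTTON

class CMyDialog : public CDialogEx
{
public:
    CMyDialog() : CDialogEx(IDD_MY_DIALOG) {}

protected:
    afx_msg void OnBnClickedMyButton();   // fires for mouse clicks and screen taps alike
    DECLARE_MESSAGE_MAP()
};

BEGIN_MESSAGE_MAP(CMyDialog, CDialogEx)
    ON_BN_CLICKED(IDC_MY_BUTTON, &CMyDialog::OnBnClickedMyButton)
END_MESSAGE_MAP()

void CMyDialog::OnBnClickedMyButton()
{
    // Same code path whether the button was clicked or tapped.
    AfxMessageBox(_T("Button activated."));
}
```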
That said, you can add explicit touch support in one of two ways:
Gestures: a Windows-provided mapping of distinctive touch sequences into gestures such as zoom and pan. MFC further translates these gestures into a simplified set of CWnd virtual methods that can be overridden as required (see the first sketch after this list).
Touch messages: register the window to receive the low-level touch messages, which may come from multiple touch points simultaneously, and respond to these touch events in the message handler (see the second sketch after this list).
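A rough sketch of the gesture route, assuming a CView-derived class (CTouchView) with a hypothetical zoom member; the exact meaning of the delta values is documented on MSDN:

```cpp
#include <afxwin.h>

class CTouchView : public CView
{
protected:
    double m_zoomFactor = 1.0;                      // hypothetical zoom state for illustration

    virtual void OnDraw(CDC* /*pDC*/) override {}   // drawing omitted in this sketch

    // MFC calls this when Windows reports a pinch-zoom gesture.
    virtual BOOL OnGestureZoom(CPoint ptCenter, long lDelta) override
    {
        UNREFERENCED_PARAMETER(ptCenter);
        m_zoomFactor *= 1.0 + static_cast<double>(lDelta) / 100.0;  // illustrative scaling only
        Invalidate();                               // repaint at the new zoom level
        return TRUE;                                // TRUE = gesture handled
    }

    // MFC calls this when Windows reports a pan gesture.
    virtual BOOL OnGesturePan(CPoint ptFrom, CPoint ptTo) override
    {
        ScrollWindow(ptTo.x - ptFrom.x, ptTo.y - ptFrom.y);
        return TRUE;
    }
};
```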
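And a rough sketch of the low-level route, assuming a dialog class (CTouchDlg) and a project targeting Windows 7 or later: register the window for WM_TOUCH in OnInitDialog and override CWnd::OnTouchInput, which is called once per touch point:

```cpp
#include <afxwin.h>
#include <afxdialogex.h>
#include "resource.h"   // assumed to define IDD_TOUCH_DLG

class CTouchDlg : public CDialogEx
{
public:
    CTouchDlg() : CDialogEx(IDD_TOUCH_DLG) {}

protected:
    virtual BOOL OnInitDialog() override
    {
        CDialogEx::OnInitDialog();
        RegisterTouchWindow();   // ask Windows to send WM_TOUCH instead of emulating mouse input
        return TRUE;
    }

    // Called once per touch point; several points can arrive in the same batch.
    virtual BOOL OnTouchInput(CPoint pt, int nInputNumber, int nInputsCount,
                              PTOUCHINPUT pInput) override
    {
        if (pInput->dwFlags & TOUCHEVENTF_DOWN)
        {
            TRACE(_T("Touch point %d of %d down at (%d, %d)\n"),
                  nInputNumber, nInputsCount, pt.x, pt.y);
        }
        return TRUE;             // TRUE = touch input handled
    }
};
```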
Source: Check this article for details.