Sending a (serial) break using the Windows (XP+) API

Posted 2019-07-02 18:46

Is there a better way to send a serial break than the SetCommBreak - delay - ClearCommBreak sequence?

I have to communicate with a microcontroller that uses a serial break as the start of a packet at 115k2, and the SetCommBreak approach has two problems:

  • At 115k2 the break lasts well under 1 ms, so the timing gets critical.

  • Since the break must be embedded in the packet stream at the correct position, I expect trouble with the FIFO.

Is there a better way of doing this, without moving the serial communication to a thread that runs with the FIFO disabled? The UART is typically a 16550+.

I do have a choice, in the sense that the microcontroller can be switched (with other firmware) to a more conventional packet format, but the manual warns that the "break" variant provides hardware integrity checking of the serial link.

The compiler is Delphi (2009/XE), but any code or even just a reference is welcome.

1 Answer
The star · answered 2019-07-02 19:30

The short answer is that serial programming with Windows is fairly limited :-(

You're right that the normal way of sending a break is with SetCommBreak(), and yes, you have to handle the delay yourself - which tends to mean the break ends up substantially longer than it needs to be. The good news is that this doesn't usually matter - most devices expecting a break will treat a much longer break in exactly the same way as a short one.
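That sequence can be sketched as follows (a minimal sketch; `hComm` is assumed to be an already-opened, non-overlapped serial port handle, and the procedure name is illustrative). Note that Sleep() has at best ~1 ms granularity, far coarser than the nominal ~90 µs break at 115k2 - which is exactly why the break comes out longer than strictly needed:

```pascal
uses Windows;

procedure SendBreak(hComm: THandle; DurationMs: Cardinal);
begin
  // Drive the TX line into the break (space) state.
  if not SetCommBreak(hComm) then
    RaiseLastOSError;
  Sleep(DurationMs);            // ~1 ms resolution at best
  // Release the line back to idle (mark).
  if not ClearCommBreak(hComm) then
    RaiseLastOSError;
end;
```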

In the event that your microcontroller is fussy about the precise duration of the break, one way of achieving a shorter, precisely-defined break is to change the baud rate on the port to a slower rate, send a zero byte, then change it back again.

The reason that this works is that a byte sent to the serial port is sent as (usually) one start bit (a zero), followed by the bits in the byte, followed by one or more stop bits (high bits). A 'break' is a sequence of zero bits that is too long to be a byte - i.e. the stop bits don't come in time. By choosing a slower baud rate and sending a zero, you end up holding the line at zero for longer than the receiver expects a byte to be, so it interprets it as a break. (It's up to you whether to determine the baud rate to use by precise calculation or trial-and-error of what the microcontroller seems to like :-)
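To make the arithmetic concrete: a $00 byte holds TX low for 9 bit times (1 start bit + 8 zero data bits), and the 115k2 receiver flags a break once the line stays low longer than its own character frame (10 bit times, about 87 µs), so any baud rate below 9/10 × 115200 ≈ 103680 should work. A sketch of the trick, again assuming an open handle `hComm` (Win32Check is from SysUtils; whether FlushFileBuffers fully drains your driver's transmit path before the rate is switched back is something to verify on your hardware):

```pascal
uses Windows, SysUtils;

procedure SendShortBreak(hComm: THandle);
var
  DCB: TDCB;
  OldBaud: DWORD;
  Zero: Byte;
  Written: DWORD;
begin
  FillChar(DCB, SizeOf(DCB), 0);
  DCB.DCBlength := SizeOf(DCB);
  Win32Check(GetCommState(hComm, DCB));
  OldBaud := DCB.BaudRate;
  try
    // 57600 holds the line low for 9/57600 s = ~156 us, well past the
    // ~87 us character frame the 115k2 receiver expects.
    DCB.BaudRate := CBR_57600;
    Win32Check(SetCommState(hComm, DCB));
    Zero := 0;
    Win32Check(WriteFile(hComm, Zero, 1, Written, nil));
    // Make sure the byte is actually on the wire before changing back.
    Win32Check(FlushFileBuffers(hComm));
  finally
    DCB.BaudRate := OldBaud;
    Win32Check(SetCommState(hComm, DCB));
  end;
end;
```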

Of course, either method (SetCommBreak() or baud changing) requires you to know when all data has been sent out of the serial port (i.e. there's nothing left in the transmit FIFO). This nice article about Windows Serial programming describes how to use SetCommMask(), WaitCommEvent() etc. to determine this.
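One way of waiting for the transmitter to drain, sketched for a non-overlapped handle (real code would preserve any event mask already in use; note that EV_TXEMPTY is edge-triggered, so if the buffer is already empty when you start waiting, WaitCommEvent can block indefinitely):

```pascal
uses Windows, SysUtils;

procedure WaitForTxEmpty(hComm: THandle);
var
  Events: DWORD;
begin
  // Ask to be notified when the last byte leaves the output buffer.
  Win32Check(SetCommMask(hComm, EV_TXEMPTY));
  Events := 0;
  // Blocks until the EV_TXEMPTY event fires.
  Win32Check(WaitCommEvent(hComm, Events, nil));
end;
```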
