Force a shell script to fflush

Posted 2020-02-10 14:46

I was wondering if it is possible to tell bash that every call to echo or printf should be followed by a call to fflush() on stdout or stderr, respectively?

A quick and dirty solution would be to write my own printf implementation that does this and use it in lieu of either builtin, but it occurred to me that I might not need to.

I'm writing several build scripts that run at once; for debugging I really need to see the messages they write in order.

Tags: linux bash shell

3 Answers
三岁会撩人
#2 · 2020-02-10 15:25

Maybe "stty raw" can help, combined with some other tricks for end-of-line handling. AFAIK "raw" mode turns off line-based buffering, at least when used on a serial port ("stty raw < /dev/ttyS0").

Explosion°爆炸
#3 · 2020-02-10 15:35

If you force the file to be read, it seems to cause the buffer to flush. Both of these work for me.

Either read the data into a useless variable:

    x=$(<"$logfile")

Or do a UUOC (useless use of cat):

    cat "$logfile" > /dev/null
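Put together as a runnable sketch (`build.log` is a stand-in name for whatever file your scripts write; whether reading actually forces a flush depends on your setup, e.g. network filesystems):

```shell
#!/usr/bin/env bash
# Hypothetical log file name (assumption; substitute your scripts' log).
logfile=build.log
printf 'step 1\nstep 2\n' > "$logfile"

# Either read the data into a throwaway variable (bash-specific form)...
x=$(<"$logfile")

# ...or use the UUOC form, which works in any POSIX shell:
cat "$logfile" > /dev/null
```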
别忘想泡老子
#4 · 2020-02-10 15:36

If commands use stdio and are connected to a terminal, they'll be flushed per line. Otherwise you'll need to use something like stdbuf on the commands in a pipeline: http://www.pixelbeat.org/programming/stdio_buffering/

tl;dr: instead of `printf ...`, try putting `stdbuf -o0 printf ...` or `stdbuf -oL printf ...` in the script.
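A minimal sketch of the stdbuf approach (this only demonstrates the command syntax and output content, not the timing; `grep` stands in here for any stdio-buffered command in the pipeline):

```shell
# grep's stdout is block-buffered when it feeds a pipe, so its matches
# can arrive late; stdbuf -oL makes it flush after every output line.
printf 'one\ntwo\nthree\n' | stdbuf -oL grep t

# -o0 disables stdout buffering entirely instead of line-buffering it:
printf 'one\ntwo\nthree\n' | stdbuf -o0 grep t
```

Note that stdbuf works by preloading a library into dynamically linked stdio programs, so it has no effect on shell builtins or statically linked binaries.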
