Here's my scenario:
- process A spawns child process B and spins threads to drain B's outputs.
- process B spawns daemon process C and drains its outputs, too.
- process B finishes, daemon process still lives.
- process A finds out via Process.waitFor() that process B exited. However, it's stuck reading B's output streams. That's because B started a daemon: C inherits B's pipe handles, so those streams only receive EOF once process C exits.
This only happens on Windows. I'm using ProcessBuilder. Here are the solutions I came up with; I'd like to hear your feedback, as I don't really like any of them:
- I can use JNA to spawn the daemon process C. This way I can create a process that is 'detached enough', so process A is no longer stuck draining B's streams. It works, but I'm not keen on this solution because it means native code (and lots of it, since I also want to consume C's outputs). Some inspiration for how to do it via JNA is at http://yajsw.sourceforge.net (though it contains far more than mere process starting). A sketch follows this list.
- Run on jre7. JDK 7 brings new goodies to ProcessBuilder, e.g. inheritIO(), which also solves my problem: with inheritIO() turned on, I can simply close all streams in the daemon process C (which I do anyway, because it's a daemon) and the problem goes away (see the second sketch below). However, I need to run on jre5+.
- Close System.out and System.err in process B before spawning the daemon process C. Again, it solves the problem, but I really need those streams in process B because I write useful stuff to them. No good. I hoped to take advantage of this characteristic by placing some kind of bootstrap process between B and C, but that didn't solve the problem.
- I don't have this problem on Linux, so could I run on Linux only? No, I can't.
- Make process A drain B's outputs in a non-blocking way. This somewhat works, but it's inconvenient: InputStream.read() is not interruptible, and while I could use InputStream.available(), it doesn't distinguish between EOF and zero bytes available. So this is only good if process A never cares about seeing EOF on B's output. It also seems more CPU intensive and generally feels awkward and not bulletproof (see the third sketch below).
- Run process C in a --dry-run mode where it merely checks that it can start: it starts, prints its welcome message, and exits. Since it's no longer long-running, it doesn't block the reads. That gives process B enough confidence that C can be started, and we can then use relatively simple JNA code to spawn the detached process without consuming its outputs (it's the consuming of the outputs that makes the JNA code messy and heavyweight). The only problem is that we no longer consume process C's outputs, but that can be solved by making C write to a well-known file that process B reads. This is more of a big, ugly workaround, but it's workable for us. Anyway, we're trying the first solution at the moment.
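For reference, a minimal sketch of the JNA route, assuming the jna-platform Kernel32 binding; the DETACHED_PROCESS constant is copied from WinBase.h and the command line is a placeholder:

    import com.sun.jna.platform.win32.Kernel32;
    import com.sun.jna.platform.win32.WinBase;
    import com.sun.jna.platform.win32.WinDef;

    public class DetachedSpawn {
        // From WinBase.h: the child gets no console. The real fix, though, is
        // bInheritHandles=false below, which keeps C off B's stdout/stderr pipes.
        private static final int DETACHED_PROCESS = 0x00000008;

        public static void main(String[] args) {
            WinBase.STARTUPINFO si = new WinBase.STARTUPINFO();
            WinBase.PROCESS_INFORMATION pi = new WinBase.PROCESS_INFORMATION();
            boolean ok = Kernel32.INSTANCE.CreateProcess(
                    null,
                    "java -jar daemon.jar",           // placeholder command for process C
                    null, null,
                    false,                            // do NOT inherit our pipe handles
                    new WinDef.DWORD(DETACHED_PROCESS),
                    null, null, si, pi);
            if (!ok) {
                throw new RuntimeException("CreateProcess failed, error "
                        + Kernel32.INSTANCE.GetLastError());
            }
            // We never touch C's stdio; A's readers on B's streams now see EOF
            // as soon as B itself exits.
            Kernel32.INSTANCE.CloseHandle(pi.hProcess);
            Kernel32.INSTANCE.CloseHandle(pi.hThread);
        }
    }

The pain mentioned above starts as soon as you also want to read C's output: then you have to create the pipes yourself via the Windows API, and the native code grows quickly.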
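The JDK 7 variant is a couple of lines by comparison (daemon.jar is again a placeholder):

    import java.io.IOException;

    class SpawnDaemonJdk7 {
        // Runs inside process B; requires JDK 7.
        static void spawnC() throws IOException {
            ProcessBuilder pb = new ProcessBuilder("java", "-jar", "daemon.jar");
            pb.inheritIO();   // C shares B's own stdio instead of fresh pipes
            pb.start();
            // C closes System.out/System.err right after startup (it's a daemon
            // anyway), so A's readers get EOF once B exits.
        }
    }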
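And the non-blocking drain in process A ends up looking roughly like this ('keepDraining' and 'sink' are my placeholders); note that nothing in the loop can ever detect EOF:

    // Drainer loop in process A; 'in' is B's stdout or stderr, 'sink' is
    // wherever A collects the output, 'keepDraining' is some external flag.
    void drain(InputStream in, OutputStream sink) throws IOException {
        byte[] buf = new byte[4096];
        while (keepDraining) {             // EOF can never end this loop
            int avail = in.available();    // never blocks, but never signals EOF either
            if (avail > 0) {
                int n = in.read(buf, 0, Math.min(buf.length, avail));
                sink.write(buf, 0, n);
            } else {
                try {
                    Thread.sleep(50);      // trades latency for CPU
                } catch (InterruptedException e) {
                    break;                 // let the thread die if A gives up
                }
            }
        }
    }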
I would really appreciate any hints!
I just encountered the same problem, and I think I have a workaround. In process A, I have roughly the following code fragment after Process.waitFor(), where outT and errT are the threads that read process B's stdout and stderr, respectively:
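A minimal sketch of that fragment (the timeout values are arbitrary):

    int rc = p.waitFor();
    // Give the reader threads a moment to drain whatever is still buffered,
    // then interrupt them: on Windows they would otherwise hang around forever,
    // because the daemon grandchild keeps the pipe handles open.
    outT.join(1000);
    errT.join(1000);
    outT.interrupt();
    errT.interrupt();
    p.destroy();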
I'm not sure p.destroy() is needed, but I've been trying all kinds of combinations to deal with the problem.
Anyway, in the run() method of the outT/errT threads I have the following, where 'pipe' is a Writer instance I capture the sub-process's stdout/stderr into, and 'in' is the stdout or stderr stream obtained from the Process:
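A minimal sketch of that run() method, assuming 'in' (an InputStream) and 'pipe' (a Writer) are fields of the thread; buffer size and sleep interval are arbitrary:

    public void run() {
        byte[] buf = new byte[4096];
        try {
            while (!Thread.currentThread().isInterrupted()) {
                // available() never blocks, but it also cannot signal EOF:
                // it returns 0 both for "pipe empty" and for "writer gone".
                if (in.available() > 0) {
                    int n = in.read(buf, 0, Math.min(buf.length, in.available()));
                    if (n > 0) {
                        pipe.write(new String(buf, 0, n));
                    }
                } else {
                    Thread.sleep(50);   // throws when the parent interrupts us
                }
            }
            pipe.flush();
        } catch (InterruptedException e) {
            // process A gave up after waitFor(); just let the thread die
        } catch (IOException e) {
            // stream closed underneath us; nothing left to drain
        }
    }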
It seems that I never get an EOF indication from any sub-process, even after the sub-process terminates; hence all the chicanery above to prevent stale threads and blocking.