Read streaming input from subprocess.communicate()

Python, Subprocess

Python Problem Overview


I'm using Python's subprocess.communicate() to read stdout from a process that runs for about a minute.

How can I print out each line of that process's stdout in a streaming fashion, so that I can see the output as it's generated, but still block on the process terminating before continuing?

subprocess.communicate() appears to give all the output at once.
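
For illustration, here's roughly what I have now ("cmd" and "arg1" are placeholders); nothing prints until the process has finished:

from subprocess import Popen, PIPE

p = Popen(["cmd", "arg1"], stdout=PIPE)
out, _ = p.communicate()   # blocks until the process exits, then returns everything at once
print(out)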

Python Solutions


Solution 1 - Python

To get subprocess' output line by line as soon as the subprocess flushes its stdout buffer:

#!/usr/bin/env python2
from subprocess import Popen, PIPE

p = Popen(["cmd", "arg1"], stdout=PIPE, bufsize=1)
with p.stdout:
    for line in iter(p.stdout.readline, b''):
        print line,
p.wait() # wait for the subprocess to exit

iter() is used to read lines as soon as they are written, to work around the read-ahead bug in Python 2 (a plain for line in p.stdout would read the pipe in large chunks before yielding anything).

If the subprocess uses block buffering instead of line buffering for its stdout in non-interactive mode (which delays the output until the child's buffer is full or is flushed explicitly by the child), you could try to force unbuffered output using the pexpect or pty modules, or the unbuffer, stdbuf, or script utilities; see Q: Why not just use a pipe (popen())?
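
For example, a minimal sketch using the stdbuf utility (assuming stdbuf from GNU coreutils is available and the child uses default C stdio buffering; "cmd" and "arg1" are placeholders as above):

#!/usr/bin/env python2
from subprocess import Popen, PIPE

# stdbuf -oL asks the child (if it uses C stdio and doesn't set its own buffering) to line-buffer stdout
p = Popen(["stdbuf", "-oL", "cmd", "arg1"], stdout=PIPE, bufsize=1)
with p.stdout:
    for line in iter(p.stdout.readline, b''):
        print line,
p.wait()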


Here's Python 3 code:

#!/usr/bin/env python3
from subprocess import Popen, PIPE

with Popen(["cmd", "arg1"], stdout=PIPE, bufsize=1,
           universal_newlines=True) as p:
    for line in p.stdout:
        print(line, end='')

Note: Unlike the Python 2 version, which writes the subprocess' byte strings as is, the Python 3 version uses text mode (cmd's output is decoded using the locale.getpreferredencoding(False) encoding).
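
If you need a specific encoding instead of the locale default, Python 3.6+ lets you pass it directly to Popen. A minimal sketch, assuming the child emits UTF-8:

#!/usr/bin/env python3
from subprocess import Popen, PIPE

# encoding= implies text mode; bufsize=1 means line-buffered on our side
with Popen(["cmd", "arg1"], stdout=PIPE, bufsize=1, encoding='utf-8') as p:
    for line in p.stdout:
        print(line, end='')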

Solution 2 - Python

Please note, I think J.F. Sebastian's method (Solution 1, above) is better.


Here is a simple example (with no checking for errors):

import subprocess
proc = subprocess.Popen('ls',
                        shell=True,
                        stdout=subprocess.PIPE,
                        )
while proc.poll() is None:
    output = proc.stdout.readline()
    print output,

If ls ends too fast, then the while loop may end before you've read all the data.

You can catch the remainder in stdout this way:

output = proc.communicate()[0]
print output,

Solution 3 - Python

I believe the simplest way to collect output from a process in a streaming fashion is like this:

import sys
from subprocess import *
proc = Popen('ls', shell=True, stdout=PIPE)
while True:
    data = proc.stdout.readline()   # Alternatively proc.stdout.read(1024)
    if len(data) == 0:
        break
    sys.stdout.write(data)   # sys.stdout.buffer.write(data) on Python 3.x

The readline() or read() function should only return an empty string on EOF, after the process has terminated - otherwise it will block if there is nothing to read (readline() includes the newline, so on empty lines, it returns "\n"). This avoids the need for an awkward final communicate() call after the loop.

On files with very long lines read() may be preferable to reduce maximum memory usage - the number passed to it is arbitrary, but excluding it results in reading the entire pipe output at once which is probably not desirable.
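
A minimal sketch of that chunked variant (the 1024 is an arbitrary chunk size, as noted above):

import sys
from subprocess import Popen, PIPE

proc = Popen('ls', shell=True, stdout=PIPE)
while True:
    data = proc.stdout.read(1024)   # read at most 1024 bytes at a time
    if len(data) == 0:              # an empty result means EOF
        break
    sys.stdout.write(data)          # sys.stdout.buffer.write(data) on Python 3.x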

Solution 4 - Python

If you want a non-blocking approach, don't use process.communicate(). If you set the subprocess.Popen() argument stdout to PIPE, you can read from process.stdout and check if the process still runs using process.poll().
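
A minimal sketch of that idea ("cmd" and "arg1" are placeholders; the final loop drains anything still buffered after the process exits):

import subprocess

p = subprocess.Popen(["cmd", "arg1"], stdout=subprocess.PIPE, universal_newlines=True)
while p.poll() is None:          # the process is still running
    line = p.stdout.readline()   # blocks until a full line (or EOF) is available
    if line:
        print(line, end='')
# drain whatever is still buffered after the process has exited
for line in p.stdout:
    print(line, end='')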

Solution 5 - Python

If you're simply trying to pass the output through in real time, it's hard to get simpler than this:

import subprocess

# This will raise a CalledProcessError if the program returns a nonzero code.
# You can use call() instead if you don't care about that case.
subprocess.check_call(['ls', '-l'])

See the docs for subprocess.check_call().

If you need to process the output, sure, loop on it. But if you don't, just keep it simple.

Edit: J.F. Sebastian points out both that the defaults for the stdout and stderr parameters pass through to sys.stdout and sys.stderr, and that this will fail if sys.stdout and sys.stderr have been replaced (say, for capturing output in tests).
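
A hedged sketch of one way around that, if you do need the output to go through the Python-level sys.stdout (for example, when it has been swapped for a StringIO in a test):

import subprocess
import sys

p = subprocess.Popen(['ls', '-l'], stdout=subprocess.PIPE, universal_newlines=True)
for line in p.stdout:
    sys.stdout.write(line)   # goes through the (possibly replaced) Python-level sys.stdout
p.wait()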

Solution 6 - Python

import subprocess

myCommand = "ls -l"
cmd = myCommand.split()
# universal_newlines=True enables "universal newline support": \n, \r\n and \r are each interpreted as a newline.
p = subprocess.Popen(cmd, stderr=subprocess.PIPE, universal_newlines=True)
while True:
    line = p.stderr.readline()
    if not line:   # readline() returns '' at EOF, once the process has closed stderr
        break
    print(line.rstrip('\r\n'))

Solution 7 - Python

Adding another Python 3 solution with a few small changes:

  1. Allows you to catch the exit code of the shell process (I have been unable to get the exit code while using the with construct)
  2. Also pipes stderr out in real time

import subprocess
import sys
def subcall_stream(cmd, fail_on_error=True):
    # Run a shell command, streaming output to STDOUT in real time
    # Expects a list style command, e.g. `["docker", "pull", "ubuntu"]`
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, bufsize=1, universal_newlines=True)
    for line in p.stdout:
        sys.stdout.write(line)
    p.wait()
    exit_code = p.returncode
    if exit_code != 0 and fail_on_error:
        raise RuntimeError(f"Shell command failed with exit code {exit_code}. Command: `{cmd}`")
    return exit_code
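
Example usage (any list-style command works):

exit_code = subcall_stream(["ls", "-l"])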

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type        | Original Author        | Original Content on Stackoverflow
Question            | Heinrich Schmetterling | View Question on Stackoverflow
Solution 1 - Python | jfs                    | View Answer on Stackoverflow
Solution 2 - Python | unutbu                 | View Answer on Stackoverflow
Solution 3 - Python | Chiara Coetzee         | View Answer on Stackoverflow
Solution 4 - Python | Lukáš Lalinský         | View Answer on Stackoverflow
Solution 5 - Python | Nate                   | View Answer on Stackoverflow
Solution 6 - Python | Petr J                 | View Answer on Stackoverflow
Solution 7 - Python | bigfoot56              | View Answer on Stackoverflow