Does Python logging flush every log?


Python Problem Overview


When I write a log to file using the standard logging module, will each log record be flushed to disk separately? For example, will the following code flush the log 10 times?

logging.basicConfig(level=logging.DEBUG, filename='debug.log')
for i in xrange(10):
    logging.debug("test")

If so, will it slow things down?

Python Solutions


Solution 1 - Python

Yes, it does flush the output at every call. You can see this in the source code for the StreamHandler:

def flush(self):
    """
    Flushes the stream.
    """
    self.acquire()
    try:
        if self.stream and hasattr(self.stream, "flush"):
            self.stream.flush()
    finally:
        self.release()

def emit(self, record):
    """
    Emit a record.

    If a formatter is specified, it is used to format the record.
    The record is then written to the stream with a trailing newline.  If
    exception information is present, it is formatted using
    traceback.print_exception and appended to the stream.  If the stream
    has an 'encoding' attribute, it is used to determine how to do the
    output to the stream.
    """
    try:
        msg = self.format(record)
        stream = self.stream
        stream.write(msg)
        stream.write(self.terminator)
        self.flush()   # <---
    except (KeyboardInterrupt, SystemExit): #pragma: no cover
        raise
    except:
        self.handleError(record)

I wouldn't worry about the performance of logging, at least not before profiling and finding that it is actually a bottleneck. In any case, you can always create a Handler subclass that doesn't flush on every call to emit (although you risk losing a lot of logs if an unhandled exception occurs or the interpreter crashes).
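Such a subclass can be a minimal sketch like the following (the class name `NoFlushStreamHandler` is hypothetical, not part of the logging module). It overrides flush() to do nothing, so emit() still writes each record but leaves flushing to the underlying stream and OS:

```python
import logging

class NoFlushStreamHandler(logging.StreamHandler):
    """A StreamHandler that skips the per-record flush.

    Hypothetical sketch: records stay in the stream's own buffer until
    the stream is closed, so a hard crash can lose recent log lines.
    """
    def flush(self):
        # Intentionally a no-op; emit() still writes each record, but
        # flushing is deferred to the underlying stream / OS buffers.
        pass
```

The trade-off is exactly the one described above: fewer system calls per record, but anything still sitting in the stream buffer is lost if the process dies abruptly.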

Solution 2 - Python

In order to buffer logging messages and output them conditionally, you can use a MemoryHandler to decorate the target Handler (e.g. a FileHandler or StreamHandler). The signature is logging.handlers.MemoryHandler(capacity, flushLevel=ERROR, target=None, flushOnClose=True), with the capacity argument specifying the buffer size (the number of records buffered).

import logging
from logging.handlers import MemoryHandler

logger = logging.getLogger(__name__)
file_handler = logging.FileHandler('test.log', mode='a')
memory_handler = MemoryHandler(capacity=100, flushLevel=logging.ERROR,
                               target=file_handler, flushOnClose=True)
logger.addHandler(memory_handler)

You can check the source code for the MemoryHandler:

def shouldFlush(self, record):
	"""
	Check for buffer full or a record at the flushLevel or higher.
	"""
	return (len(self.buffer) >= self.capacity) or \
			(record.levelno >= self.flushLevel)

def flush(self):
	"""
	For a MemoryHandler, flushing means just sending the buffered
	records to the target, if there is one. Override if you want
	different behaviour.

	The record buffer is also cleared by this operation.
	"""
	self.acquire()
	try:
		if self.target:
			for record in self.buffer:
				self.target.handle(record)
			self.buffer = []
	finally:
		self.release()

def close(self):
	"""
	Flush, if appropriately configured, set the target to None and lose the
	buffer.
	"""
	try:
		if self.flushOnClose:
			self.flush()
	finally:
		self.acquire()
		try:
			self.target = None
			BufferingHandler.close(self)
		finally:
			self.release()

For more details, have a look at the corresponding section of the Python Logging Cookbook.
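The buffering behavior described above can be seen end to end in a small demo. As an assumption for illustration, a StringIO-backed StreamHandler stands in for the FileHandler so the target's contents can be inspected directly; records below flushLevel stay in the MemoryHandler's buffer until an ERROR arrives:

```python
import io
import logging
from logging.handlers import MemoryHandler

# StringIO-backed target so we can see what has actually been flushed.
stream = io.StringIO()
target = logging.StreamHandler(stream)
memory_handler = MemoryHandler(capacity=10, flushLevel=logging.ERROR,
                               target=target)

logger = logging.getLogger("memoryhandler_demo")
logger.propagate = False
logger.setLevel(logging.DEBUG)
logger.addHandler(memory_handler)

logger.debug("buffered")   # held in memory: buffer not full, below flushLevel
logger.error("boom")       # at flushLevel: both records are sent to the target
```

After the debug() call the stream is still empty; the error() call triggers shouldFlush(), and flush() then forwards both buffered records to the target in order.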

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Vincent Xue | View Question on Stackoverflow
Solution 1 - Python | Bakuriu | View Answer on Stackoverflow
Solution 2 - Python | fakeProgrammer | View Answer on Stackoverflow