logging with filters

Python Logging

Python Problem Overview


I'm using the logging module (import logging) to log messages.

Within a single module, I am logging messages at the debug level, e.g. my_logger.debug('msg').

Some of these debug messages come from function_a() and others from function_b(); I'd like to be able to enable/disable logging based on whether they come from a or from b.

I'm guessing that I have to use Logging's filtering mechanism.

Can someone show me how the code below would need to be instrumented to do what I want?

import logging
import sys

logger = logging.getLogger( "module_name" )

def function_a( ... ):
    logger.debug( "a message" )

def function_b( ... ):
    logger.debug( "another message" )

if __name__ == "__main__":
    logging.basicConfig( stream=sys.stderr, level=logging.DEBUG )

    #don't want function_a()'s noise -> ....
    #somehow filter-out function_a's logging
    function_a()

    #don't want function_b()'s noise -> ....
    #somehow filter-out function_b's logging
    function_b()

If I scaled this simple example up to more modules and more functions per module, I'd be concerned about ending up with lots of loggers.

Can I keep it down to one logger per module? Also note that the log messages are "structured": for instance, all the functions doing parsing work log with a common prefix, logger.debug("parsing: xxx"). Can I somehow shut off all "parsing" messages with a single line, regardless of which module/function emits them?

Python Solutions


Solution 1 - Python

Just implement a subclass of logging.Filter: <http://docs.python.org/library/logging.html#filter-objects>. It will have one method, filter(record), that examines the log record and returns True to log it or False to discard it. Then you can install the filter on either a Logger or a Handler by calling its addFilter(filter) method.

Example:

import logging

class NoParsingFilter(logging.Filter):
    def filter(self, record):
        return not record.getMessage().startswith('parsing')

# "logger" is the module-level logger from the question
logger.addFilter(NoParsingFilter())

Or something like that, anyway.
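
To tie this back to the question's goal of toggling output per function, a filter can also key off record.funcName. The sketch below is only illustrative (the SuppressFunctionFilter name is made up here) and reuses the question's module_name logger and functions:

import logging
import sys

logger = logging.getLogger("module_name")

def function_a():
    logger.debug("a message")

def function_b():
    logger.debug("another message")

class SuppressFunctionFilter(logging.Filter):
    """Drop records emitted from the named function (hypothetical helper)."""
    def __init__(self, func_name):
        super().__init__()
        self.func_name = func_name

    def filter(self, record):
        # logging fills record.funcName with the name of the calling function
        return record.funcName != self.func_name

if __name__ == "__main__":
    logging.basicConfig(stream=sys.stderr, level=logging.DEBUG)
    logger.addFilter(SuppressFunctionFilter("function_a"))
    function_a()  # suppressed by the filter
    function_b()  # still written to stderr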

Solution 2 - Python

Do not use global. It's an accident waiting to happen.

You can give your loggers any "."-separated names that are meaningful to you.

You can control them as a hierarchy. If you have loggers named a.b.c and a.b.d, you can check the logging level for a.b and alter both loggers.

You can have any number of loggers -- they're inexpensive.
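
As a minimal sketch of that hierarchy (using the a.b.c / a.b.d names above purely as placeholders), setting the level on the shared parent a.b changes the effective level of both children:

import logging
import sys

logging.basicConfig(stream=sys.stderr, level=logging.DEBUG)

child_c = logging.getLogger("a.b.c")
child_d = logging.getLogger("a.b.d")

# Neither child has an explicit level, so both inherit it from "a.b".
logging.getLogger("a.b").setLevel(logging.WARNING)

child_c.debug("dropped: DEBUG is below the inherited WARNING threshold")
child_d.warning("kept: WARNING meets the inherited threshold")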

The most common design pattern is one logger per module. See https://stackoverflow.com/questions/401277/naming-python-loggers

Do this.

import logging
import sys

logger = logging.getLogger( "module_name" )
logger_a = logging.getLogger( "module_name.function_a" )
logger_b = logging.getLogger( "module_name.function_b" )

def function_a( ... ):
    logger_a.debug( "a message" )

def function_b( ... ):
    logger_b.debug( "another message" )

if __name__ == "__main__":
    logging.basicConfig( stream=sys.stderr, level=logging.DEBUG )
    logger_a.setLevel( logging.DEBUG )
    logger_b.setLevel( logging.WARN )

    ... etc ...

Solution 3 - Python

I found a simpler way, using plain functions in your main script (since Python 3.2, addFilter accepts any callable, not just Filter instances):

import logging

# rm 2to3 messages
def filter_grammar_messages(record):
    if record.funcName == 'load_grammar':
        return False
    return True

def filter_import_messages(record):
    if record.funcName == 'init' and record.msg.startswith('Importing '):
        return False
    return True

logging.getLogger().addFilter(filter_grammar_messages)  # root logger
logging.getLogger('PIL.Image').addFilter(filter_import_messages)
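
Applied back to the original question, the same plain-function style shuts off every "parsing" message in one line; this is just a sketch reusing the question's module_name logger:

import logging
import sys

def no_parsing(record):
    # Keep every record except those whose formatted message starts with "parsing"
    return not record.getMessage().startswith('parsing')

logging.basicConfig(stream=sys.stderr, level=logging.DEBUG)

logger = logging.getLogger("module_name")
logger.addFilter(no_parsing)

logger.debug("parsing: xxx")     # suppressed
logger.debug("another message")  # still logged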

Solution 4 - Python

I found a slightly easier way to filter the default logging configuration for the following problem: when using the sshtunnel module, I wanted to suppress the INFO-level messages coming from paramiko.transport.

The default reporting, with the first two undesired records, looked as follows:

2020-11-10 21:53:28,114  INFO       paramiko.transport: Connected (version 2.0, client OpenSSH_7.9p1)
2020-11-10 21:53:28,307  INFO       paramiko.transport: Authentication (password) successful!
2020-11-10 21:53:28,441  INFO       |-->QuerySSH: Query execution successful.

Logger configuration update:

logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s  %(levelname)-10s %(name)s: %(message)s',
    handlers=[
        logging.StreamHandler(),
        logging.FileHandler(self.logging_handler)  # log-file path held on the author's class
    ]
)

# Filter paramiko.transport debug and info out of the basic logging configuration
logger_descope = logging.getLogger('paramiko.transport')
logger_descope.setLevel(logging.WARN)

And the result I am happy with looks like this:

2020-11-10 22:00:48,755  INFO       |-->QuerySSH: Query execution successful.
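
If the goal were to silence paramiko.transport entirely rather than just raise its level, another common option is to stop it from propagating to the root handlers; a minimal sketch:

import logging

# Keep the logger's level untouched, but prevent its records from
# reaching the handlers installed by basicConfig() on the root logger.
logging.getLogger('paramiko.transport').propagate = False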

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type         | Original Author | Original Content on Stackoverflow
Question             | jd.             | View Question on Stackoverflow
Solution 1 - Python  | David Z         | View Answer on Stackoverflow
Solution 2 - Python  | S.Lott          | View Answer on Stackoverflow
Solution 3 - Python  | Gringo Suave    | View Answer on Stackoverflow
Solution 4 - Python  | Mirris          | View Answer on Stackoverflow