Using logging in multiple modules


Python Problem Overview

I have a small python project that has the following structure -

 -- pkg01
 -- pkg02
 -- logging.conf

I plan to use the default logging module to print messages to stdout and a log file. To use the logging module, some initialization is required -

import logging.config

logging.config.fileConfig('logging.conf')
logger = logging.getLogger('pyApp')
logger.debug('testing')

At present, I perform this initialization in every module before I start logging messages. Is it possible to perform this initialization only once, in one place, so that the same settings are reused for logging all over the project?

Python Solutions

Solution 1 - Python

Best practice is, in each module, to have a logger defined like this:

import logging
logger = logging.getLogger(__name__)

near the top of the module, and then in other code in the module do e.g.

logger.debug('My message with %s', 'variable data')

If you need to subdivide logging activity inside a module, use e.g.

loggerA = logging.getLogger(__name__ + '.A')
loggerB = logging.getLogger(__name__ + '.B')

and log to loggerA and loggerB as appropriate.

In your main program or programs, do e.g.:

def main():
    "your program code"

if __name__ == '__main__':
    import logging.config
    logging.config.fileConfig('logging.conf')
    main()

See here for logging from multiple modules, and here for logging configuration for code which will be used as a library module by other code.

Update: When calling fileConfig(), you may want to specify disable_existing_loggers=False if you're using Python 2.6 or later (see the docs for more information). The default value is True for backward compatibility, which causes all existing loggers to be disabled by fileConfig() unless they or their ancestor are explicitly named in the configuration. With the value set to False, existing loggers are left alone. If using Python 2.7/Python 3.2 or later, you may wish to consider the dictConfig() API which is better than fileConfig() as it gives more control over the configuration.

Solution 2 - Python

Actually, every logger is a child of its parent package's logger (i.e. package.subpackage.module inherits configuration from package.subpackage), so all you need to do is configure the root logger. This can be achieved with logging.config.fileConfig (your own config for the loggers) or logging.basicConfig (which sets up the root logger). Set up logging in your entry module (the script you actually run):

using basicConfig:

# entry module
import logging
import sys

logging.basicConfig(stream=sys.stdout, level=logging.INFO)

using fileConfig:

# entry module
import logging
import logging.config

logging.config.fileConfig('logging.conf')

and then create every logger using:

# any module in the package or its subpackages
import logging

log = logging.getLogger(__name__)"Hello logging!")

For more information see Advanced Logging Tutorial.

Solution 3 - Python

A simple way to share one configured instance of the logging library across multiple modules was, for me, the following solution:

#
import logging

logger = logging
logger.basicConfig(format='%(asctime)s - %(message)s', level=logging.INFO)

Other files:

from base_logger import logger

if __name__ == '__main__':"This is an info message")

Solution 4 - Python

I always do it as below.

Use a single Python file to configure my log as a singleton pattern, named ''


import logging.config

def singleton(cls):
    instances = {}
    def get_instance():
        if cls not in instances:
            instances[cls] = cls()
        return instances[cls]
    return get_instance()

@singleton
class Logger():
    def __init__(self):
        logging.config.fileConfig('logging.conf')
        self.logr = logging.getLogger('root')

In another module, just import the config.

from log_conf import Logger"Hello World")

This is a singleton pattern for logging, simple and efficient.

Solution 5 - Python

Throwing in another solution.

In my module's I have something like:

# mymodule/
import logging

def get_module_logger(mod_name):
    logger = logging.getLogger(mod_name)
    handler = logging.StreamHandler()
    formatter = logging.Formatter(
        '%(asctime)s %(name)-12s %(levelname)-8s %(message)s')
    handler.setFormatter(formatter)
    logger.addHandler(handler)
    logger.setLevel(logging.DEBUG)
    return logger

Then in each module I need a logger, I do:

# any module that needs a logger
from [modname] import get_module_logger
logger = get_module_logger(__name__)

When logs from several modules are mixed together, you can differentiate their source by the module they came from.
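One caveat with this approach: calling get_module_logger() twice with the same name attaches a second handler, and every record then prints twice. A sketch of a guard against that (the worker name below is invented):

```python
import logging

def get_module_logger(mod_name):
    logger = logging.getLogger(mod_name)
    if not logger.handlers:  # configure each named logger only once
        handler = logging.StreamHandler()
        handler.setFormatter(logging.Formatter(
            '%(asctime)s %(name)-12s %(levelname)-8s %(message)s'))
        logger.addHandler(handler)
        logger.setLevel(logging.DEBUG)
    return logger

a = get_module_logger('mymodule.worker')
b = get_module_logger('mymodule.worker')  # same object, no extra handler
```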

Solution 6 - Python

Several of these answers suggest that at the top of a module you do

import logging
logger = logging.getLogger(__name__)

It is my understanding that this is considered very bad practice. The reason is that fileConfig() will disable all existing loggers by default. E.g.

import logging

logger = logging.getLogger(__name__)

def foo():'Hi, foo')

class Bar(object):
    def bar(self):'Hi, bar')

And in your main module:

import logging
import logging.config

# load my module - this creates its module-level logger
import my_module

# this call configures logging and, by default, disables existing loggers
logging.config.fileConfig('logging.ini')

bar = my_module.Bar()

Now the log specified in logging.ini will be empty, as the existing logger was disabled by the fileConfig() call.

While it is certainly possible to get around this (disable_existing_loggers=False), realistically many clients of your library will not know about this behavior and will not receive your logs. Make it easy for your clients by always calling logging.getLogger locally. Hat tip: I learned about this behavior from Victor Lin's website.

So good practice is instead to always call logging.getLogger locally. E.g.

import logging

logger = logging.getLogger(__name__)

def foo():
    logging.getLogger(__name__).info('Hi, foo')

class Bar(object):
    def bar(self):
        logging.getLogger(__name__).info('Hi, bar')    

Also, if you use fileConfig() in your main, set disable_existing_loggers=False, just in case your library designers use module-level logger instances.
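The disabling behavior is easy to demonstrate with dictConfig() as well (the logger names here are invented for the sketch):

```python
import logging
import logging.config

early = logging.getLogger('my_module')  # exists before configuration

# disable_existing_loggers defaults to True, so even this minimal config
# silently switches off every logger created before the call.
logging.config.dictConfig({'version': 1})
assert early.disabled

late = logging.getLogger('other_module')  # created afterwards: unaffected
assert not late.disabled
```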

Solution 7 - Python

You could also come up with something like this!

import logging

def get_logger(name=None):
    default = "__app__"
    formatter = logging.Formatter('%(levelname)s: %(asctime)s %(funcName)s(%(lineno)d) -- %(message)s',
                                  datefmt='%Y-%m-%d %H:%M:%S')
    log_map = {"__app__": "app.log", "__basic_log__": "file1.log", "__advance_log__": "file2.log"}
    if name:
        logger = logging.getLogger(name)
    else:
        logger = logging.getLogger(default)
    fh = logging.FileHandler(log_map[name or default])
    fh.setFormatter(formatter)
    logger.addHandler(fh)
    logger.setLevel(logging.DEBUG)
    return logger

Now you can use multiple loggers in the same module, and across the whole project, if the above is defined in a separate module and imported in other modules where logging is required.

b = get_logger("__basic_log__")"Starting logging!")
b.debug("Debug Mode")

Solution 8 - Python

@Yarkee's solution seemed better. I would like to add some more to it:

import logging

class Singleton(type):
    _instances = {}

    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
        return cls._instances[cls]

class LoggerManager(object):
    __metaclass__ = Singleton  # Python 2 syntax; in Python 3 use `class LoggerManager(metaclass=Singleton)`

    _loggers = {}

    def __init__(self, *args, **kwargs):
        pass

    @staticmethod
    def getLogger(name=None):
        if not name:
            return logging.getLogger()
        elif name not in LoggerManager._loggers:
            LoggerManager._loggers[name] = logging.getLogger(str(name))
        return LoggerManager._loggers[name]


So LoggerManager can be pluggable to the entire application. Hope it makes sense and adds value.

Solution 9 - Python

I would like to add my solution (which is based on the logging cookbook and other articles and suggestions from this thread). However, it took me quite a while to figure out why it wasn't immediately working how I expected. So I created a little test project to learn how logging is working.

Since I have figured it out, I wanted to share my solution, maybe it can be of help to someone.

I know some of my code might not be best practice, but I am still learning. I left the print() functions in there, as I used them, while logging was not working as expected. Those are removed in my other application. Also I welcome any feedback on any parts of the code or structure.

my_log_test project structure (cloned/simplified from another project I work on)

├──
├── common
│   ├──
│   └──
├── pkg1
│   ├──
│   └──
└── pkg2
    ├──
    └──


A few things done differently, or that I have not seen explicitly mentioned in the combination I use:

  • the main module is, which is called by
  • I want to be able to call the modules and separately while in development/testing
  • at this point I did not want to use basicConfig() or fileConfig(), but keep it like in the logging cookbook

So basically, that means I need to initialize the root logger in (always) and in the modules and (only when calling them directly).

To make this init in several modules easier, I created common/, which does what is described in the cookbook.

My mistakes

Beforehand, my mistake in that module was to init the logger with logger = logging.getLogger(__name__) (module logger) instead of using logger = logging.getLogger() (to get the root logger).

The first problem was that, when called from, the logger's namespace was set to my_log_test.common.my_logger. The module logger in, with the "unmatching" namespace my_log_test.pkg1.mod1, could hence not attach to the other logger, and I would see no log output from mod1.

The second "problem" was that my main program is in and not in Not a real problem for me after all, but it added to the namespace confusion.

Working solution

common/ is from the cookbook, but in a separate module. I also added a logger_cleanup function that I can call from the daemon, to remove logs older than x days.

from datetime import datetime
import time
import os

## Init logging start 
import logging
import logging.handlers

def logger_init():
    print("print in my_logger.logger_init()")
    print("print __name__: " + __name__)
    path = "log/"
    filename = "my_log_test.log"

    ## get logger
    #logger = logging.getLogger(__name__) ## this was my mistake, to init a module logger here
    logger = logging.getLogger() ## root logger
    logger.setLevel(logging.DEBUG)

    # File handler
    logfilename ="%Y%m%d_%H%M%S") + f"_{filename}"
    file = logging.handlers.TimedRotatingFileHandler(f"{path}{logfilename}", when="midnight", interval=1)
    #fileformat = logging.Formatter("%(asctime)s [%(levelname)s] %(message)s")
    fileformat = logging.Formatter("%(asctime)s [%(levelname)s]: %(name)s: %(message)s")
    file.setFormatter(fileformat)

    # Stream handler
    stream = logging.StreamHandler()
    #streamformat = logging.Formatter("%(asctime)s [%(levelname)s:%(module)s] %(message)s")
    streamformat = logging.Formatter("%(asctime)s [%(levelname)s]: %(name)s: %(message)s")
    stream.setFormatter(streamformat)

    # Adding all handlers to the logs
    logger.addHandler(file)
    logger.addHandler(stream)

def logger_cleanup(path, days_to_keep):
    lclogger = logging.getLogger(__name__)
    logpath = f"{path}"
    now = time.time()
    for filename in os.listdir(logpath):
        filestamp = os.stat(os.path.join(logpath, filename)).st_mtime
        filecompare = now - days_to_keep * 86400
        if filestamp < filecompare:
            try:
      "Delete old log " + filename)
                os.remove(os.path.join(logpath, filename))
            except Exception as e:
                lclogger.exception(e)

 (to run through it, use python3 -m my_log_test):

from my_log_test import daemon

if __name__ == '__main__':
    print("print in")

 (to run it directly, use python3 -m my_log_test.daemon):

from datetime import datetime
import time
import logging
import my_log_test.pkg1.mod1 as mod1
import my_log_test.pkg2.mod2 as mod2

## init ROOT logger from my_logger.logger_init()
from my_log_test.common.my_logger import logger_init
logger_init() ## init root logger
logger = logging.getLogger(__name__) ## module logger

def run():
    print("print in")
    print("print __name__: " + __name__)"Start daemon")
    loop_count = 1
    while True:"loop_count: {loop_count}")"do stuff from pkg1")
        mod1.do1()"finished stuff from pkg1")"do stuff from pkg2")
        mod2.do2()"finished stuff from pkg2")"Waiting a bit...")
        time.sleep(5)  # interval reconstructed; original value unknown
        loop_count += 1

if __name__ == '__main__':
    try:
        print("print in if __name__ == '__main__'")"running as main")
        run()
    except KeyboardInterrupt:"Program aborted by user")
    except Exception as e:
        logger.exception(e)

 (to run it directly, use python3 -m my_log_test.pkg1.mod1):

import logging
# mod1_logger = logging.getLogger(__name__)
mod1_logger = logging.getLogger("my_log_test.daemon.pkg1.mod1") ## for testing, namespace set manually

def do1():
    print("print in mod1.do1()")
    print("print __name__: " + __name__)"Doing something in pkg1.do1()")

if __name__ == '__main__':
    ## Also enable this pkg to be run directly while in development with
    ## python3 -m my_log_test.pkg1.mod1

    ## init root logger
    from my_log_test.common.my_logger import logger_init
    logger_init() ## init root logger

    print("print in if __name__ == '__main__'")"Running as main")
    do1()

 (to run it directly, use python3 -m my_log_test.pkg2.mod2):

import logging
logger = logging.getLogger(__name__)

def do2():
    print("print in pkg2.do2()")
    print("print __name__: " + __name__)  # namespace set through __name__"Doing something in pkg2.do2()")

if __name__ == '__main__':
    ## Also enable this pkg to be run directly while in development with
    ## python3 -m my_log_test.pkg2.mod2

    ## init root logger
    from my_log_test.common.my_logger import logger_init
    logger_init() ## init root logger

    print("print in if __name__ == '__main__'")"Running as main")
    do2()

I hope it helps. Happy to receive feedback as well!

Solution 10 - Python

There are several answers already. I ended up with a similar yet different solution that makes sense to me; maybe it will make sense to you as well. My main objective was to be able to route logs to handlers by their level (debug-level logs to the console, warnings and above to files):

from flask import Flask
import logging
from logging.handlers import RotatingFileHandler

app = Flask(__name__)

# make default logger output everything to the console
logging.basicConfig(level=logging.DEBUG)

# send warnings and above to a file as well
rotating_file_handler = RotatingFileHandler(filename="logs.log")
rotating_file_handler.setLevel(logging.WARNING)
app.logger.addHandler(rotating_file_handler)


I then created a nice util file named

import logging

def get_logger(name):
    return logging.getLogger("app.flask." + name)

The "app.flask" is a hardcoded value in Flask: the application logger always starts with "app.flask", as it's the module's name.

now, in each module, i'm able to use it in the following mode:

from logger import get_logger

logger = get_logger(__name__)"new log")

This will create a new log for "app.flask.MODULE_NAME" with minimum effort.

Solution 11 - Python

The best practice would be to create a separate module which has only one method, whose task is to give a logger to the calling code. Save this file as

import logging

def getlogger():
    # logger
    logger = logging.getLogger(__name__)
    logger.setLevel(logging.DEBUG)
    # create console handler and set level to debug
    #ch = logging.StreamHandler()
    ch = logging.FileHandler(r'log.txt')
    # create formatter
    formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s')
    # add formatter to ch
    ch.setFormatter(formatter)
    # add ch to logger
    logger.addHandler(ch)
    return logger

Now call the getlogger() method whenever a logger is needed.

from m_logger import getlogger

logger = getlogger()'My mssg')

Solution 12 - Python

New to Python, so I don't know if this is advisable, but it works great for not re-writing boilerplate.

Your project must have an so it can be loaded as a module:

# Put this in your module's
import logging.config
import sys

# I used this dictionary test, you would put:
# logging.config.fileConfig('logging.conf')
# The "" entry in loggers is the root logger, tutorials always
# use "root" but I can't get that to work
LOG_CONFIG = {
    "version": 1,
    "formatters": {
        "default": {
            "format": "%(asctime)s %(levelname)s %(name)s %(message)s"
        }
    },
    "handlers": {
        "console": {
            "level": "DEBUG",
            "class": "logging.StreamHandler",
            "stream": "ext://sys.stdout"
        }
    },
    "loggers": {
        "": {
            "level": "DEBUG",
            "handlers": ["console"]
        }
    }
}
logging.config.dictConfig(LOG_CONFIG)

def logger():
    # Get the name from the caller of this function
    return logging.getLogger(sys._getframe(1).f_globals['__name__'])

sys._getframe(1) suggestion comes from here

Then to use your logger in any other file:

from [your module name here] import logger

logger().info("log away")

Caveats:
  1. You must run your files as modules, otherwise import [your module] won't work:
    • python -m [your module name].[your filename without .py]
  2. The name of the logger for the entry point of your program will be __main__, but any solution using __name__ will have that issue.


All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Quest Monger | View Question on Stackoverflow
Solution 1 - Python | Vinay Sajip | View Answer on Stackoverflow
Solution 2 - Python | Stan Prokop | View Answer on Stackoverflow
Solution 3 - Python | Alex Jolig | View Answer on Stackoverflow
Solution 4 - Python | Yarkee | View Answer on Stackoverflow
Solution 5 - Python | Tommy | View Answer on Stackoverflow
Solution 6 - Python | phil_20686 | View Answer on Stackoverflow
Solution 7 - Python | deeshank | View Answer on Stackoverflow
Solution 8 - Python | deeshank | View Answer on Stackoverflow
Solution 9 - Python | Phil | View Answer on Stackoverflow
Solution 10 - Python | Ben Yitzhaki | View Answer on Stackoverflow
Solution 11 - Python | Mousam Singh | View Answer on Stackoverflow
Solution 12 - Python | npjohns | View Answer on Stackoverflow