Logging
The following classes and functions help set up logging in child processes and log execution times to custom log files.
- aca_view.logging.time_logging(name='')[source]
Context manager to log execution times.
It is used like this:
    with aca_view.logging.time_logging("my_id"):
        pass
- Parameters:
name (str) – the string ID that will appear in the log
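The source of the context manager is not shown here; a minimal sketch of what it could look like, assuming it reports elapsed wall-clock time through the 'aca_view' logger (the message format is hypothetical, not the actual implementation):

```python
import logging
import time
from contextlib import contextmanager


@contextmanager
def time_logging(name=""):
    # Hypothetical sketch: measure the wall-clock time spent inside
    # the `with` block and log it under the given string ID.
    start = time.perf_counter()
    try:
        yield
    finally:
        elapsed = time.perf_counter() - start
        logging.getLogger("aca_view").info("%s took %.3f seconds", name, elapsed)
```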
- aca_view.logging.log_time(name='')[source]
Decorator for logging function execution time.
It is used like this:
    @aca_view.logging.log_time("my_id")
    def my_function():
        pass
- Parameters:
name (str) – the string ID that will appear in the log
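A hypothetical sketch of such a decorator, assuming it wraps the function and logs elapsed time to the 'aca_view' logger (the names and message format are illustrative, not the actual implementation):

```python
import functools
import logging
import time


def log_time(name=""):
    # Hypothetical sketch of a timing decorator.
    def decorator(func):
        @functools.wraps(func)  # preserve the wrapped function's metadata
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                elapsed = time.perf_counter() - start
                logging.getLogger("aca_view").info(
                    "%s took %.3f seconds", name, elapsed
                )
        return wrapper
    return decorator


@log_time("my_id")
def my_function():
    return 42
```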
- class aca_view.logging.LoggingSetup(queue=None)[source]
Context manager to configure logging.
The rationale for this context manager is that we need to be able to handle logging to one file from multiple processes. To accomplish this, this class starts a thread that listens to messages from a queue. This queue needs to be passed to secondary processes. Otherwise, the different processes will step on each other’s toes when writing to the file. We need to make sure this thread is stopped at the end.
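The queue-and-listener pattern described above also exists directly in the standard library; a self-contained sketch using logging.handlers.QueueHandler and QueueListener (independent of aca_view; the logger name is illustrative):

```python
import logging
import logging.handlers
from multiprocessing import Queue

# One queue shared by all processes; a secondary process would receive it
# as an argument and attach a QueueHandler to its loggers.
queue = Queue()

worker_logger = logging.getLogger("worker")
worker_logger.setLevel(logging.INFO)
worker_logger.addHandler(logging.handlers.QueueHandler(queue))

# The main process runs a listener thread that drains the queue into the
# real handler, so only one process ever writes to the destination.
handler = logging.StreamHandler()
listener = logging.handlers.QueueListener(queue, handler)
listener.start()
worker_logger.info("hello through the queue")
listener.stop()  # stop the thread; pending records are flushed first
```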
Internally, this class uses the dictionary in aca_view.config.SETTINGS['logging'] to configure the logging module using logging.config.dictConfig. In secondary processes, all handlers are removed and all messages go through the queue.
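The actual contents of aca_view.config.SETTINGS['logging'] are not reproduced here; as an illustration only, a minimal dictionary of the general shape logging.config.dictConfig accepts:

```python
import logging
import logging.config

# Illustrative only: the real aca_view.config.SETTINGS['logging'] dictionary
# is not reproduced here, but it follows this dictConfig schema.
LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "plain": {"format": "%(asctime)s %(name)s %(levelname)s %(message)s"},
    },
    "handlers": {
        "console": {"class": "logging.StreamHandler", "formatter": "plain"},
    },
    "loggers": {
        "aca_view": {"handlers": ["console"], "level": "INFO"},
    },
}

logging.config.dictConfig(LOGGING_CONFIG)
```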
While setting up, this class checks the settings for common issues and “fixes” them so execution will not stop.
In tear_down, only the thread is stopped. Other changes to logging configuration remain.
Normally one would use this at the very top-level of a script. It would look like this:
    import logging

    from aca_view.logging import LoggingSetup

    def main():
        with LoggingSetup():
            logging.getLogger('aca_view').info('some message')

    if __name__ == "__main__":
        main()
To use it in a child process, one must pass a queue to the process, and this queue needs to be passed to LoggingSetup. Something like this:
    import logging

    from multiprocessing import Process

    from aca_view.logging import LoggingSetup

    def run(logging_queue):
        import time
        with LoggingSetup(logging_queue):
            logging.getLogger('aca_view').info('some message')
            time.sleep(2)
            logging.getLogger('aca_view').info('another message')

    with LoggingSetup():
        process = Process(target=run, args=(LoggingSetup.queue,))
        process.start()
        # if this is not here, all messages after exiting the context are lost
        process.join()
If the context manager functionality does not suit the application (e.g. child processes continue logging after the end of the main function), one can call the setup/tear_down methods directly:
    import logging

    from multiprocessing import Process

    from aca_view.logging import LoggingSetup

    def run(logging_queue):
        import time
        with LoggingSetup(logging_queue):
            logging.getLogger('aca_view').info('some message')
            time.sleep(2)
            logging.getLogger('aca_view').info('another message')

    def main():
        process = Process(target=run, args=(LoggingSetup.queue,))
        process.start()
        logging.getLogger('aca_view').info('some message')
        process.join()

    if __name__ == "__main__":
        logging_setup = LoggingSetup()
        logging_setup.setup()
        main()
        logging_setup.tear_down()