Suppose we are developing an automated scripting tool with the following project structure: the Common package implements the framework functionality, and the Scripts directory contains the test case scripts we write (please ignore the other, unrelated directories).
Our requirements for the logging functionality are as follows:
1 To make the logs easy to review, each script gets its own log file, named after the script.
2 The log path and the amount of log kept per script can be configured; for example, if it is set to 5MB, the oldest log is automatically overwritten once that limit is exceeded.
3 The logging facility should be easy to use and loosely coupled with the framework's business functionality.
Now let's analyze these requirements one by one.
1 One log file per script. The key issue is how the logging module can obtain the name of the test case script, since the log file has to be generated from that name.
The common ways to get the file name are os.getcwd(), sys.argv[0], and __file__.
Let's look at what each of them returns. Start by creating a file, say test.py (it is imported by name below), containing the following code:
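A minimal sketch of such a file, assuming it only defines a func() that prints the three values in question (the original listing was not reproduced here):

# test.py: print the three common ways of obtaining a file/script name
import os
import sys

def func():
    print(os.getcwd())    # directory the script was run from
    print(sys.argv[0])    # path of the script being executed
    print(__file__)       # path of the file containing this code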
Then, in another file, import test and call its func() method:
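A minimal caller might look like this (the file name caller.py is chosen purely for illustration):

# caller.py (hypothetical name): exercise the three path values
import test

test.func()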
Run the caller. From the output we can see that os.getcwd() returns the directory from which the script was run (the current working directory), sys.argv[0] is the absolute path of the script being executed, and __file__ is the absolute path of the file in which the executed code lives.
It is now clear that sys.argv[0] is what we should use to get the name of the executing script. Since it returns a full path, a little processing is needed to extract just the name:

sys.argv[0].split('/')[-1].split('.')[0]
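As an aside (not part of the original framework), the same extraction can also be written with os.path helpers, which copes with Windows-style backslash separators as well:

import os
import sys

# strip the directory and the extension from the running script's path
script_name = os.path.splitext(os.path.basename(sys.argv[0]))[0]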
2 Log capacity. To automatically overwrite the oldest logs once the capacity is exceeded, use the RotatingFileHandler from the logging module (logging.handlers), which lets you set the maximum size of a log file and the number of backup files to keep.
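As a minimal standalone illustration (the file name and limits here are arbitrary), attaching such a handler looks like this:

import logging
from logging.handlers import RotatingFileHandler

demo_log = logging.getLogger('demo')
# rotate once the file reaches 5 MB, keeping at most 3 old files
handler = RotatingFileHandler('demo.log', maxBytes=5 * 1024 * 1024, backupCount=3)
demo_log.addHandler(handler)
demo_log.setLevel(logging.INFO)
demo_log.info('hello')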
So where should the log path and capacity configuration live? Having the user modify the RotatingFileHandler parameters directly is obviously not good; it is better if the user never has to touch the framework files and only calls its interfaces from their own scripts.
The scheme used here is to write the configuration into a file. An XML file is well suited as a configuration file: the user configures things by editing the XML, and the logging module reads its parameters from it.
For convenience, the XML file is placed under the Common package, with the following contents:
<?xml version="1.0" encoding="utf-8"?>
<config>
    <!-- Log save path -->
    <logpath>E:\PythonLog</logpath>
    <!-- Log file size for each script, in MB -->
    <logsize>8</logsize>
    <!-- Number of log files kept per script -->
    <lognum>3</lognum>
</config>
Reading the XML with the lxml library is very simple; the full code appears in the logging module below, and the core of it is just a few lines.
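A sketch of just the parsing step, assuming the configuration file is named config.xml (the actual file name is not given in the source):

from lxml import etree
import os

# parse the XML that sits in the same directory as this module
root = etree.parse(os.path.join(os.path.dirname(__file__), 'config.xml')).getroot()
logpath = root.find('logpath').text          # log save path
logsize = int(root.find('logsize').text)     # per-script size, in MB
lognum = int(root.find('lognum').text)       # number of files to keep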
3 The logging facility should be easy to use. To reduce the coupling between business functions and the framework, it is best to encapsulate the logging functionality and expose only an interface for writing logs.
Exposing that interface as class methods satisfies the requirement: the user calls the interface through the class, from anywhere, without having to create an instance, so there is no coupling with the framework's business code.
With the above analysis, let's implement the logging module.
Since logging is also one of the framework's foundation features, the logging module goes into the Common package as well. Create a new file under Common (its name is not given in the source; call it log.py here) with the following code:
# coding: utf-8
from lxml import etree
import logging
import logging.handlers
import os
import sys

# Provide logging capabilities
class logger:
    # First read the configuration data from the XML file.
    # The XML sits in the same directory as this file, so its absolute path is
    # built from __file__. NOTE: the config file name 'config.xml' is assumed
    # here; the source does not state it.
    root = etree.parse(os.path.join(os.path.dirname(__file__), 'config.xml')).getroot()
    # Log file save path
    logpath = root.find('logpath').text
    # Log file capacity, converted from MB to bytes
    logsize = 1024 * 1024 * int(root.find('logsize').text)
    # Number of log files to keep per script
    lognum = int(root.find('lognum').text)

    # Log file name: the name of the use case script, joined with the log save
    # path, gives the absolute path of the log file
    logname = os.path.join(logpath, sys.argv[0].split('/')[-1].split('.')[0])

    # Initialize the logger
    log = logging.getLogger()

    # Log format, which can be adjusted as desired
    fmt = logging.Formatter('[%(asctime)s][%(filename)s][line:%(lineno)d][%(levelname)s] %(message)s',
                            '%Y-%m-%d %H:%M:%S')

    # Output to a file, using the log name, size and backup count obtained above
    handle1 = logging.handlers.RotatingFileHandler(logname, maxBytes=logsize, backupCount=lognum)
    handle1.setFormatter(fmt)
    # Also output to the screen so runs are easy to observe
    handle2 = logging.StreamHandler(stream=sys.stdout)
    handle2.setFormatter(fmt)

    log.addHandler(handle1)
    log.addHandler(handle2)

    # Set the log level; INFO means only INFO and above are printed
    log.setLevel(logging.INFO)

    # Logging interface: users only need to call these class methods. Only
    # INFO, WARNING and ERROR are defined here; add more as needed.
    @classmethod
    def info(cls, msg):
        cls.log.info(msg)
        return

    @classmethod
    def warning(cls, msg):
        cls.log.warning(msg)
        return

    @classmethod
    def error(cls, msg):
        cls.log.error(msg)
        return
To test it, put the following code in both script1 and script2 under the Scripts directory:
# the module path Common.log is assumed here (see above); the class name logger comes from the module
from Common.log import *

logger.info('This is info')
logger.warning('This is warning')
logger.error('This is error')
Run the two scripts separately: each prints its log lines to the console, and a log file named after the script (script1, script2) is created under the configured path E:\PythonLog, containing the same log records.
Now, whether from other files inside the framework or from user scripts, logs can be written easily through the logger class's interface. That is all on adding logging to Python scripts; I hope this article helps you with learning Python.