SoFunction
Updated on 2024-12-10

Python script to implement Zabbix multi-line log monitoring: process analysis

While using Zabbix log monitoring I ran into a problem. Oracle's alert logs contain errors that usually nobody cleans up manually, and because Zabbix's mechanism rereads the whole log the next time it is written to, error lines that already triggered an alert are checked again and produce duplicate alarms. In addition, Zabbix log monitoring can only match keywords on the current line, which is not very flexible: for example, matching one keyword and then matching a second keyword within the next N lines is cumbersome. Here is an effective and convenient way to solve the problem.

The requirements for log monitoring through a Python script:

1. Record the position up to which the script has already checked the log, so the next run does not alarm on the same lines again.
2. Keyword matching supports regular-expression-style alternation.
3. Support multiple keyword queries: for example, after the first keyword matches, match a second keyword within the next N lines.

The parameters are passed in the following format:
python3 /u03/ '(ORA-|REEOR),(04030|02011)' 2

The first parameter is the log path. The second parameter is the keyword expression. The third parameter is N: after the first keyword (ORA-|REEOR) matches, the next N (here 2) lines are searched for the second keyword (04030|02011). The script is implemented as follows:

import os
import sys

logtxt = ""

def read_txt(files, start_line):
	"""Read the log file and return its lines starting from start_line."""
	data = []
	data.append("")
	with open(str(files), "r", encoding='UTF-8') as f:
		for line in f.readlines():
			line = line.strip('\n')  # Remove line breaks from each element of the list
			data.append(line)
	# Record the number of lines read this time
	write_log(len(data) - 1)
	if len(data) > start_line:
		return data[start_line - 1:]
	else:
		print("The starting line number is greater than the total number of lines in the file!")
		return []

def write_log(lines):
	global logtxt
	with open(logtxt, "w") as file:  # "w" overwrites the content on every run
		file.write(str(lines))

def read_log():
	global logtxt
	if not os.path.exists(logtxt):
		with open(logtxt, "w") as file:  # first run: start from line 1
			file.write(str(1))
	with open(logtxt, "r", encoding='UTF-8') as f:
		s_lines = f.readlines()
	print("Start checking from line " + str(s_lines[0]))
	return s_lines[0]

def deal_read_log(files, keyword, interval_line):
	# '(ORA-|REEOR),(04030|02011)' -> ['ORA-|REEOR', '04030|02011']
	keywords = keyword.replace("(", "").replace(")", "").replace("'", "").replace('"', '').split(',')
	start_keywords = keywords[0].split("|")
	end_keywords = keywords[1].split("|")
	start_line = read_log()
	lines_data = read_txt(files, int(start_line))
	for_line = 1
	while for_line < len(lines_data):
		# Does the current line contain a start keyword?
		isexist = 0
		for sk in start_keywords:
			if sk in lines_data[for_line]:
				isexist = 1
				break
		if isexist == 1:
			# Does the current line also contain an end keyword?
			isexist2 = 0
			for sk in end_keywords:
				if sk in lines_data[for_line]:
					isexist2 = 1
					break
			if isexist2 == 1:
				print(lines_data[for_line])
			else:
				# No end keyword on the current line: look ahead up to
				# interval_line lines for one.
				flag_line = for_line  # mark the current line
				count = 1
				for_line = for_line + 1
				while for_line < len(lines_data):
					isexist3 = 0
					for sk in end_keywords:
						if sk in lines_data[for_line]:
							isexist3 = 1
							break
					if isexist3 == 1:
						# Print the whole block from the marked line down
						for prin in range(flag_line, for_line + 1):
							print(lines_data[prin])
						break
					for_line = for_line + 1
					if count == int(interval_line):
						break
					count = count + 1
				for_line = for_line - 1
		for_line = for_line + 1

if __name__ == '__main__':
	files = sys.argv[1]
	# Derive the position file name from the log file name
	if '.log' in files:
		logtxt = files.replace(".log", "_log.txt")
	else:
		logtxt = files.replace(".txt", "_log.txt")
	keywords = sys.argv[2]
	interval_line = int(sys.argv[3])
	deal_read_log(files, keywords, interval_line)
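To make the matching logic easier to follow, here is a minimal standalone sketch of the same idea: find lines that contain a start keyword, and if the end keyword is not on the same line, look ahead up to N lines for it. The sample log lines and the function name `match_multiline` are invented for illustration, not part of the original script.

```python
def match_multiline(lines, start_keywords, end_keywords, interval_line):
    """Return blocks of lines where a start keyword is followed,
    within interval_line lines, by an end keyword."""
    matches = []
    i = 0
    while i < len(lines):
        if any(sk in lines[i] for sk in start_keywords):
            if any(ek in lines[i] for ek in end_keywords):
                # Start and end keyword on the same line
                matches.append([lines[i]])
            else:
                # Look ahead up to interval_line lines for an end keyword
                for j in range(i + 1, min(i + 1 + interval_line, len(lines))):
                    if any(ek in lines[j] for ek in end_keywords):
                        matches.append(lines[i:j + 1])
                        i = j
                        break
        i += 1
    return matches

# Invented sample alert-log excerpt
log = [
    "Mon Dec 09 10:00:01 2024",
    "ORA-04030: out of process memory",
    "Errors in file /u01/app/trace.trc:",
    "ORA-00600: internal error",
    "ORA-",
    "some context line",
    "04030 detail line",
]
blocks = match_multiline(log, ["ORA-", "REEOR"], ["04030", "02011"], 2)
for block in blocks:
    print(block)
```

The first match is a single line that contains both keywords; the second is a three-line block where the end keyword only appears two lines below the start keyword.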

The next step is to add monitoring

Add the UserParameter to the agent's conf file.
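The original UserParameter line is not preserved in this article; a typical entry might look like the following, where the item key `oracle.alert.log` and the script path `/u03/check_log.py` are assumptions for illustration:

```
UserParameter=oracle.alert.log[*],python3 /u03/check_log.py $1 $2 $3
```

After editing the file, restart the Zabbix agent so the new UserParameter takes effect, then create an item with this key and a trigger on non-empty output.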


That completes the monitoring setup.

This is the whole content of this article.