
MySQL Slow Query Log Analysis Script

Date: 2019-04-04 11:40:59



This is a log analysis script built on top of pt-query-digest: it runs the tool over the previous day's slow query log and reformats the report into an easier-to-read, visualization-friendly text file.

The directory layout is as follows (a setup sketch follows the listing):

./mysql_data/log

./mysql_data/log/tmp

./slow_query
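The script assumes these directories already exist before the first run. Below is a minimal setup sketch, assuming the layout lives alongside the script as in the paths above; this setup step is not part of the original script.

import os

dirname = os.path.dirname(os.path.abspath(__file__))

# directories the analysis script expects: raw slow logs, pt-query-digest output, formatted reports
for path in (
    os.path.join(dirname, 'mysql_data', 'log'),
    os.path.join(dirname, 'mysql_data', 'log', 'tmp'),
    os.path.join(dirname, 'slow_query'),
):
    os.makedirs(path, exist_ok=True)  # no-op if the directory is already there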

# coding: gbk
__author__ = 'T_two'
import datetime
import os

IP = '111'
dirname = os.path.dirname(os.path.abspath(__file__))
# directory for the final formatted reports
slow_query = os.path.join(dirname, 'slow_query')
# directory holding the raw slow logs (input to pt-query-digest)
mysql_data = os.path.join(os.path.join(dirname, 'mysql_data'), 'log')
# directory holding the pt-query-digest output
tmp = os.path.join(mysql_data, 'tmp')
# web directory the finished report is copied to (adjust to your environment)
web_dir = '/var/www/html/slow_query'


def getYesterday():
    today = datetime.date.today()
    yesterday = str(today - datetime.timedelta(days=1))
    return yesterday


def getLog(yes_time, slow_query):
    # build the three file names used in the pt-query-digest analysis
    before_name = yes_time.replace('-', '') + '-' + 'slow-query.log'
    # raw slow log, before pt-query-digest: b_filename
    b_filename = os.path.join(mysql_data, before_name)
    # pt-query-digest report: a_filename
    after_name = yes_time.replace('-', '') + '-' + IP + '-' + 'slow-query.log'
    a_filename = os.path.join(tmp, after_name)
    # final formatted report: e_filename
    end_name = IP + '-slow-log-' + yes_time + '.txt'
    e_filename = os.path.join(slow_query, end_name)
    return b_filename, a_filename, e_filename


def getSlowquery(b_filename, a_filename, e_filename):
    print('File format starting...')
    # run pt-query-digest over the raw slow log
    os.system('pt-query-digest ' + b_filename + ' > ' + a_filename)
    a_slow_query = open(a_filename, 'r', encoding='utf8')
    e_slow_query = open(e_filename, 'w', encoding='utf8')
    _line = ''
    lines = a_slow_query.readlines()[20:]  # drop the first 20 header lines of the report, which are not needed
    for line in lines:
        line = line.strip()
        # skip comment lines, except the attribute lines we want to keep
        if line.startswith('#') and '# Hosts' not in line and '# Users' not in line \
                and '# Databases' not in line and 'byte' not in line \
                and '# Count' not in line and '# Exec time' not in line:
            pass
        elif line == '':
            pass
        else:
            # sequence number of the query
            if '# Query' in line:
                line = ('\nNO.%s' % line.split()[2])
            # execution count
            elif '# Count' in line:
                line = ('执行次数: %s' % line.split()[3])
            # execution time
            elif '# Exec time' in line:
                line = ('执行时间 Total: %s min: %s max: %s' % (line.split()[4], line.split()[5], line.split()[6]))
            # database name
            elif '# Databases' in line:
                line = ('库名: %s' % line.split()[2])
            # client IP
            elif '# Host' in line:
                line = ('源IP: %s' % line.split()[2])
            # user name
            elif '# User' in line:
                line = ('用户名: %s' % line.split()[2])
            _line = _line + line + '\n'
    e_slow_query.write(_line)
    a_slow_query.close()
    e_slow_query.close()
    # copy the report to the web directory
    os.system('cp ' + e_filename + ' ' + web_dir)
    # remove files older than 10 days
    os.system('find ' + str(slow_query) + ' -mtime +10 | xargs rm -rf ')
    os.system('find ' + mysql_data + ' -mtime +10 | xargs rm -rf ')
    os.system('find ' + tmp + ' -mtime +10 | xargs rm -rf ')
    print('File format end...')


if __name__ == '__main__':
    yes_time = getYesterday()
    b_filename, a_filename, e_filename = getLog(yes_time, slow_query)
    getSlowquery(b_filename, a_filename, e_filename)
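To make the naming convention concrete, here is a small illustrative check of the paths getLog() produces, assuming IP = '111' and a run on 2019-04-04 (so yesterday is 2019-04-03); the values are examples only:

# illustrative only: reproduces getLog()'s naming scheme for a sample date
yes_time = '2019-04-03'   # what getYesterday() returns on 2019-04-04
IP = '111'

before_name = yes_time.replace('-', '') + '-' + 'slow-query.log'
after_name = yes_time.replace('-', '') + '-' + IP + '-' + 'slow-query.log'
end_name = IP + '-slow-log-' + yes_time + '.txt'

print(before_name)  # 20190403-slow-query.log      -> raw log in ./mysql_data/log
print(after_name)   # 20190403-111-slow-query.log  -> pt-query-digest report in ./mysql_data/log/tmp
print(end_name)     # 111-slow-log-2019-04-03.txt  -> formatted report in ./slow_query

In practice the script would typically be run once a day (for example from cron), shortly after the daily slow log rotation, so that yesterday's log is complete when it is analyzed.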

The formatted result looks like this:
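Schematically, each slow query in the report becomes one block using the labels written by the script above (placeholders shown here, not real values), followed by the SQL statement carried over from the pt-query-digest report:

NO.<query number>
执行次数: <total count>
执行时间 Total: <total> min: <min> max: <max>
库名: <database>
源IP: <client host>
用户名: <user>
<the slow SQL statement>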
