I needed to check the performance of a database after some changes. This one-liner gets me some raw numbers out of MySQL's slow query log:
tail -n 10000 log-slow-queries.log|grep Query|awk '{print $3}'|awk -F . '{print $1}'|sort -n|uniq -c|awk '{print $2" "$1}'
It outputs the number of seconds a query took, and the number of times that duration shows up in the log. Let's break it down step by step (I'll trace a sample line through the whole pipeline after the breakdown).
This looks at a line like this:
# Query_time: 164.308245 Lock_time: 0.000135 Rows_sent: 5424030 Rows_examined: 5452911 Rows_affected: 5424030 Rows_read: 5452911
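For context, each entry in the slow query log spans several lines. Roughly like this (an illustrative sketch with placeholder values; the exact header lines vary by MySQL/Percona version):

# Time: ...
# User@Host: ...
# Query_time: 164.308245 Lock_time: 0.000135 Rows_sent: 5424030 Rows_examined: 5452911 Rows_affected: 5424030 Rows_read: 5452911
SET timestamp=...;
SELECT ...;

Only the Query_time line matters for this pipeline.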
tail -n 10000 log-slow-queries.log – look at the last 10,000 lines of the file
grep Query – only find lines that have Query in them
awk '{print $3}' – print the third word on the found line (awk defaults to using spaces as delimiters), so it ignores the "#" and "Query_time:" and returns 164.308245
awk -F . '{print $1}' – Now I use a period as a delimiter. I'm not picky about milliseconds, so I discard them. This leaves me with 164
sort -n – sort all the numbers numerically
uniq -c – count the number of times a number shows up.
awk '{print $2" "$1}' – this reverses the order of the output from uniq -c, so instead of "10 120" meaning 10 counts of 120 seconds, it shows as "120 10"
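To make that concrete, here's the sample line from above traced through each stage:

grep Query keeps just the header line:
# Query_time: 164.308245 Lock_time: 0.000135 Rows_sent: 5424030 Rows_examined: 5452911 Rows_affected: 5424030 Rows_read: 5452911
awk '{print $3}' narrows that to:
164.308245
awk -F . '{print $1}' drops the fraction:
164
sort -n | uniq -c groups and counts the values (count first, then seconds):
1 164
and the final awk flips the columns (seconds first, then count):
164 1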
The end result of all this is a table that looks like this:
1 1317
2 49
3 9
5 1
9 1
10 1
17 1
18 1
24 1
39 1
55 1
86 1
164 1
243 1
253 1
265 1
317 2
1126 1
1997 1
2723 1
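As an aside (my own addition, not part of the original one-liner): since the uniq -c output at that point is count followed by seconds, one more awk stage on the end can roll the whole table up into a rough total:

tail -n 10000 log-slow-queries.log|grep Query|awk '{print $3}'|awk -F . '{print $1}'|sort -n|uniq -c|awk '{queries += $1; seconds += $1*$2} END {print queries" queries, "seconds" seconds total"}'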
I can then look at the slow query log and search for 2723 to see what query took 2,723 seconds to complete.
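Something like this should pull up the offending entry (2723 is the whole-seconds value, so the full Query_time will be 2723.something; -A grabs a few lines of trailing context, which is where the SQL usually sits):

grep -A 5 "Query_time: 2723" log-slow-queries.log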