Shell analysis server log command collection
2022-07-01 04:37:00 【Moshow Zhengkai】
1. Count how many distinct IPs have visited:
awk '{print $1}' log_file|sort|uniq|wc -l
2. Count how many times a page has been visited:
grep "/index.php" log_file | wc -l
3. Count how many pages each IP visited:
awk '{++S[$1]} END {for (a in S) print a,S[a]}' log_file > log.txt
sort -n -t ' ' -k 2 log.txt
Use sort on the second column to order the result by page count.
4. Pages visited per IP, sorted from small to large:
awk '{++S[$1]} END {for (a in S) print S[a],a}' log_file | sort -n
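As a quick check, the per-IP counting idiom above can be run against a tiny fabricated log (the IPs, paths, and timestamps below are made up for illustration):

```shell
# Build a three-line synthetic access log (IPs and paths are fabricated)
cat > /tmp/demo_access.log <<'EOF'
1.1.1.1 - - [16/Aug/2015:14:00:01 +0800] "GET /a HTTP/1.1" 200 123
2.2.2.2 - - [16/Aug/2015:14:00:02 +0800] "GET /b HTTP/1.1" 200 456
1.1.1.1 - - [16/Aug/2015:14:00:03 +0800] "GET /c HTTP/1.1" 200 789
EOF

# Pages visited per IP, sorted from small to large
awk '{++S[$1]} END {for (a in S) print S[a],a}' /tmp/demo_access.log | sort -n
# prints:
# 1 2.2.2.2
# 2 1.1.1.1
```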
5. See which pages a given IP visited:
grep ^111.111.111.111 log_file| awk '{print $1,$7}'
6. Count pages excluding search-engine (bot) requests:
awk '{print $12,$1}' log_file | grep '^"Mozilla' | awk '{print $2}' |sort | uniq | wc -l
7. Count how many IPs visited during the 14:00 hour on 16 August 2015:
awk '{print $4,$1}' log_file | grep 16/Aug/2015:14 | awk '{print $2}'| sort | uniq | wc -l
8. View the top ten visiting IP addresses:
awk '{print $1}' access_log |sort|uniq -c|sort -nr |head -10
uniq -c is equivalent to grouping the lines and prefixing each group with its count.
cat access.log|awk '{print KaTeX parse error: Expected 'EOF', got '}' at position 2: 1}̲'|sort|uniq -c|…(11)]+=1}; END {for(url in counts) print counts[url], url}
9. The 10 most-visited files or pages:
cat log_file|awk '{print $11}'|sort|uniq -c|sort -nr | head -10
cat log_file|awk '{print $11}'|sort|uniq -c|sort -nr|head -20
awk '{print $1}' log_file |sort -n -r |uniq -c | sort -n -r | head -20
The last command lists the 20 IPs with the most visits.
10. Access counts per subdomain, based on the referer (slightly inaccurate):
cat access.log | awk '{print $11}' | sed -e 's/http:\/\///' -e 's/\/.*//' | sort | uniq -c | sort -rn | head -20
11. List the files with the largest transfer sizes:
cat www.access.log |awk '($7~/\.php/){print $10 " " $1 " " $4 " " $7}'|sort -nr|head -100
12. List pages larger than 200000 bytes (about 200KB) and how often each occurs:
cat www.access.log |awk '($10 > 200000 && $7~/\.php/){print $7}'|sort -n|uniq -c|sort -nr|head -100
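A minimal sketch of this size filter on fabricated data (field 10 is the byte count, field 7 the URL; all values below are invented):

```shell
# Synthetic log lines; only /big.php exceeds the 200000-byte threshold
cat > /tmp/demo_big.log <<'EOF'
1.1.1.1 - - [16/Aug/2015:14:00:01 +0800] "GET /big.php HTTP/1.1" 200 300000
2.2.2.2 - - [16/Aug/2015:14:00:02 +0800] "GET /small.php HTTP/1.1" 200 1000
3.3.3.3 - - [16/Aug/2015:14:00:03 +0800] "GET /big.php HTTP/1.1" 200 400000
EOF

awk '($10 > 200000 && $7~/\.php/){print $7}' /tmp/demo_big.log | sort | uniq -c | sort -nr
# prints (modulo leading spaces):
# 2 /big.php
```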
13. If the last field of the log records the page transfer time, list the pages that take the longest to reach the client:
cat www.access.log |awk '($7~/\.php/){print $NF " " $1 " " $4 " " $7}'|sort -nr|head -100
14. List the most time-consuming pages (over 60 seconds) and how often each occurs:
cat www.access.log |awk '($NF > 60 && $7~/\.php/){print $7}'|sort -n|uniq -c|sort -nr|head -100
15. List files whose transfer time exceeds 30 seconds:
cat www.access.log |awk '($NF > 30){print $7}'|sort -n|uniq -c|sort -nr|head -20
16. List the processes running on the current server, grouped and sorted in descending order:
ps -ef | awk -F ' ' '{print $8 " " $9}' |sort | uniq -c |sort -nr |head -20
17. Check Apache's current number of concurrent connections:
netstat -an | grep ESTABLISHED | wc -l
Compare the result with MaxClients in httpd.conf to see how far apart they are.
18. The following commands can also be used to inspect the data:
ps -ef|grep httpd|wc -l
Counts the httpd processes; on an Apache server every request starts a process. The result means Apache is currently handling N concurrent requests; Apache adjusts this value automatically according to load.
netstat -nat|grep -i "80"|wc -l
netstat -an prints the system's current network connection states, grep -i "80" extracts the connections related to port 80, and wc -l counts them.
The number returned is the total of all requests on port 80.
netstat -na|grep ESTABLISHED|wc -l
netstat -an prints the system's current network connection states, grep ESTABLISHED extracts only the established connections, and wc -l counts them.
The number returned is the total of established connections on port 80.
netstat -nat|grep ESTABLISHED|wc -l
Counts all established connections; drop the wc -l to view the detailed record of each one.
19. Output the connection count per IP, plus the total number of connections in each state:
netstat -n | awk '/^tcp/ {n=split($(NF-1),array,":");if(n<=2)++S[array[(1)]];else++S[array[(4)]];++s[$NF];++N} END {for(a in S){printf("%-20s %s\n", a, S[a]);++I}printf("%-20s %s\n","TOTAL_IP",I);for(a in s) printf("%-20s %s\n",a, s[a]);printf("%-20s %s\n","TOTAL_LINK",N);}'
20. Other collections:
Analyze the log file for the top 20 most-visited URLs on 2012-05-04, sorted:
cat access.log |grep '04/May/2012'| awk '{print $11}'|sort|uniq -c|sort -nr|head -20
Query the IP addresses whose URL contains www.abc.com:
cat access_log | awk '($11~/www.abc.com/){print $1}'|sort|uniq -c|sort -nr
Get the 10 most-visited IP addresses (this can also be combined with a time filter):
cat linewow-access.log|awk '{print $1}'|sort|uniq -c|sort -nr|head -10
Query the log within a time range:
cat log_file | egrep '15/Aug/2015|16/Aug/2015' |awk '{print $1}'|sort|uniq -c|sort -nr|head -10
IPs that visited "/index.php?g=Member&m=Public&a=sendValidCode" between 15/Aug/2015 and 16/Aug/2015, in descending order:
cat log_file | egrep '15/Aug/2015|16/Aug/2015' | awk '{if($7 == "/index.php?g=Member&m=Public&a=sendValidCode") print $1,$7}'|sort|uniq -c|sort -nr
($7~/\.php/) selects lines where $7 contains .php; output the 100 most time-consuming PHP pages:
cat log_file |awk '($7~/\.php/){print $NF " " $1 " " $4 " " $7}'|sort -nr|head -100
List the most time-consuming pages (over 60 seconds) and how often each occurs:
cat access.log |awk '($NF > 60 && $7~/\.php/){print $7}'|sort -n|uniq -c|sort -nr|head -100
Site traffic statistics (GB):
cat access.log |awk '{sum+=$10} END {print sum/1024/1024/1024}'
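The same summing idiom can be sanity-checked on two fabricated entries whose sizes add up to exactly 1 MB (dividing by 1024 twice instead of three times):

```shell
# Two synthetic entries of 512 KB each in field 10 (the response size)
cat > /tmp/demo_bytes.log <<'EOF'
1.1.1.1 - - [16/Aug/2015:14:00:01 +0800] "GET /a HTTP/1.1" 200 524288
2.2.2.2 - - [16/Aug/2015:14:00:02 +0800] "GET /b HTTP/1.1" 200 524288
EOF

# Sum field 10 and convert bytes to MB
awk '{sum+=$10} END {print sum/1024/1024}' /tmp/demo_bytes.log
# prints: 1
```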
Count 404 responses:
awk '($9 ~/404/)' access.log | awk '{print $9,$7}' | sort
HTTP status statistics:
cat access.log |awk '{counts[$(9)]+=1}; END {for(code in counts) print code, counts[code]}'
cat access.log |awk '{print $9}'|sort|uniq -c|sort -rn
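On a fabricated three-line log, the second form produces one count per status code as expected (the status lives in field 9 of the combined log format):

```shell
cat > /tmp/demo_status.log <<'EOF'
1.1.1.1 - - [16/Aug/2015:14:00:01 +0800] "GET /a HTTP/1.1" 200 100
2.2.2.2 - - [16/Aug/2015:14:00:02 +0800] "GET /b HTTP/1.1" 404 100
3.3.3.3 - - [16/Aug/2015:14:00:03 +0800] "GET /a HTTP/1.1" 200 100
EOF

awk '{print $9}' /tmp/demo_status.log | sort | uniq -c | sort -rn
# prints (modulo leading spaces):
# 2 200
# 1 404
```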
Concurrency per second:
watch "awk '{if(\$9~/200|30|404/)COUNT[\$4]++}END{for(a in COUNT) print a,COUNT[a]}' log_file|sort -k 2 -nr|head -n10"
Note: the $ signs must be escaped, because single quotes do not prevent expansion inside the double-quoted string passed to watch.
Bandwidth statistics (count of GET requests):
cat apache.log |awk '{if($7~/GET/) count++}END{print "client_request="count}'
Find the 10 most-visited IPs:
cat /tmp/access.log | grep "20/Mar/2011" |awk '{print $3}'|sort |uniq -c|sort -nr|head
What the IP with the most connections that day was doing:
cat access.log | grep "10.0.21.17" | awk '{print $8}' | sort | uniq -c | sort -nr | head -n 10
The 10 one-hour periods with the most IP connections:
awk -vFS="[:]" '{gsub("-.*","",$1);num[$2" "$1]++}END{for(i in num)print i,num[i]}' log_file | sort -n -k 3 -r | head -10
Find the minutes with the most visits:
awk '{print $4}' access.log | grep "20/Mar/2011" |cut -c 14-18|sort|uniq -c|sort -nr|head
Print $4 (the timestamp field); characters 14-18 of it are the hour and minute.
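A sketch of the minute extraction on invented timestamps, showing that characters 14-18 of the bracketed timestamp field are exactly HH:MM:

```shell
cat > /tmp/demo_min.log <<'EOF'
1.1.1.1 - - [20/Mar/2011:14:05:01 +0800] "GET /a HTTP/1.1" 200 100
2.2.2.2 - - [20/Mar/2011:14:05:09 +0800] "GET /b HTTP/1.1" 200 100
3.3.3.3 - - [20/Mar/2011:14:07:02 +0800] "GET /c HTTP/1.1" 200 100
EOF

awk '{print $4}' /tmp/demo_min.log | grep "20/Mar/2011" | cut -c 14-18 | sort | uniq -c | sort -nr
# prints (modulo leading spaces):
# 2 14:05
# 1 14:07
```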
Take a 5-minute slice of the log (excerpt):
if [ "$DATE_MINUTE" != "$DATE_END_MINUTE" ] ;then
# if the start minute and end minute timestamps are not equal,
# take the line number of the start timestamp and the line number of the end timestamp
START_LINE=$(sed -n "/$DATE_MINUTE/=" $APACHE_LOG|head -n1)
fi
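The excerpt above only shows the start-line lookup. A fuller sketch of the idea on a fabricated log (variable names are carried over from the excerpt; note the \| sed delimiter, needed because the timestamp itself contains slashes):

```shell
# Synthetic five-line log (timestamps invented)
cat > /tmp/demo_5min.log <<'EOF'
1.1.1.1 - - [16/Aug/2015:13:59:59 +0800] "GET /a HTTP/1.1" 200 10
1.1.1.1 - - [16/Aug/2015:14:00:01 +0800] "GET /b HTTP/1.1" 200 10
1.1.1.1 - - [16/Aug/2015:14:03:00 +0800] "GET /c HTTP/1.1" 200 10
1.1.1.1 - - [16/Aug/2015:14:05:30 +0800] "GET /d HTTP/1.1" 200 10
1.1.1.1 - - [16/Aug/2015:14:06:00 +0800] "GET /e HTTP/1.1" 200 10
EOF

APACHE_LOG=/tmp/demo_5min.log
DATE_MINUTE="16/Aug/2015:14:00"
DATE_END_MINUTE="16/Aug/2015:14:05"

if [ "$DATE_MINUTE" != "$DATE_END_MINUTE" ]; then
    # \| makes | the sed address delimiter, since the pattern contains slashes
    START_LINE=$(sed -n "\|$DATE_MINUTE|=" "$APACHE_LOG" | head -n1)
    END_LINE=$(sed -n "\|$DATE_END_MINUTE|=" "$APACHE_LOG" | tail -n1)
    # print everything between the first start-minute line and the last end-minute line
    sed -n "${START_LINE},${END_LINE}p" "$APACHE_LOG"
fi
```

This prints the /b, /c, and /d requests, i.e. everything stamped between 14:00 and 14:05.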
Check TCP connection states:
netstat -nat |awk '{print $6}'|sort|uniq -c|sort -rn
netstat -n | awk '/^tcp/ {++S[$NF]};END {for(a in S) print a, S[a]}'
netstat -n | awk '/^tcp/ {++state[$NF]}; END {for(key in state) print key,"\t",state[key]}'
netstat -n | awk '/^tcp/ {++arr[$NF]};END {for(k in arr) print k,"\t",arr[k]}'
netstat -n |awk '/^tcp/ {print $NF}'|sort|uniq -c|sort -rn
netstat -ant | awk '{print $NF}' | grep -v '[a-z]' | sort | uniq -c
netstat -ant|awk '/ip:80/{split($5,ip,":");++S[ip[1]]}END{for (a in S) print S[a],a}' |sort -n
netstat -ant|awk '/:80/{split($5,ip,":");++S[ip[1]]}END{for (a in S) print S[a],a}' |sort -rn|head -n 10
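The split-based IP counting can be exercised on a captured, fabricated snippet of netstat output instead of live connections:

```shell
# Fake netstat -ant lines; field 5 is the remote address
cat > /tmp/demo_est.txt <<'EOF'
tcp        0      0 10.0.0.1:80   1.1.1.1:5000   ESTABLISHED
tcp        0      0 10.0.0.1:80   1.1.1.1:5001   ESTABLISHED
tcp        0      0 10.0.0.1:80   2.2.2.2:5002   ESTABLISHED
EOF

awk '/:80/{split($5,ip,":");++S[ip[1]]}END{for (a in S) print S[a],a}' /tmp/demo_est.txt | sort -rn
# prints:
# 2 1.1.1.1
# 1 2.2.2.2
```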
awk 'BEGIN{printf ("http_code\tcount_num\n")}{COUNT[$10]++}END{for (a in COUNT) printf a"\t\t"COUNT[a]"\n"}' log_file
(The original omitted the input file; the document's log_file placeholder is used here.)
Top 20 IPs by request count (often used to find attack sources):
netstat -anlp|grep 80|grep tcp|awk '{print $5}'|awk -F: '{print $1}'|sort|uniq -c|sort -nr|head -n20
netstat -ant |awk '/:80/{split($5,ip,":");++A[ip[1]]}END{for(i in A) print A[i],i}' |sort -rn|head -n20
Use tcpdump to sniff port 80 and see which visitor is heaviest:
tcpdump -i eth0 -tnn dst port 80 -c 1000 | awk -F"." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr |head -20
Find excessive TIME_WAIT connections:
netstat -n|grep TIME_WAIT|awk '{print $5}'|sort|uniq -c|sort -rn|head -n20
Find excessive SYN connections:
netstat -an | grep SYN | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | more
Find the process by port:
netstat -ntlp | grep 80 | awk '{print $7}' | cut -d/ -f1
Check the total connection count and the current established count:
netstat -ant | grep $ip:80 | wc -l
netstat -ant | grep $ip:80 | grep EST | wc -l
Check the number of visits per IP:
netstat -nat|grep ":80"|awk '{print $5}' |awk -F: '{print $1}' | sort| uniq -c|sort -n
Linux commands to analyze the current connection states:
netstat -n | awk '/^tcp/ {++S[$NF]} END {for(a in S) print a, S[a]}'
watch "netstat -n | awk '/^tcp/ {++S[\$NF]} END {for(a in S) print a, S[a]}'" # watch keeps the statistics refreshing (note the escaped \$NF)
LAST_ACK 5 # closing a TCP connection requires closing both directions; each side sends a FIN to end its direction of data. The side that sends the last FIN enters LAST_ACK, and only after it receives the peer's ACK of that FIN is the whole TCP connection really closed;
SYN_RECV 30 # number of connection requests waiting to be processed;
ESTABLISHED 1597 # connections in normal data-transfer state;
FIN_WAIT1 51 # the server side actively requested closing the tcp connection;
FIN_WAIT2 504 # the client side has disconnected;
TIME_WAIT 1057 # processing finished, waiting for the timeout to expire;
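The state-counting awk can likewise be checked against a captured snippet instead of a live netstat (the addresses below are fabricated):

```shell
cat > /tmp/demo_netstat.txt <<'EOF'
tcp        0      0 10.0.0.1:80   1.1.1.1:5000   ESTABLISHED
tcp        0      0 10.0.0.1:80   2.2.2.2:5001   TIME_WAIT
tcp        0      0 10.0.0.1:80   3.3.3.3:5002   ESTABLISHED
EOF

awk '/^tcp/ {++S[$NF]} END {for(a in S) print a, S[a]}' /tmp/demo_netstat.txt | sort
# prints:
# ESTABLISHED 2
# TIME_WAIT 1
```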