Awk tools
2022-06-26 13:22:00 【C chord~】
Catalog
Introduction
One. Principle
Two. Command format
Three. Common awk built-in variables
Case study
1. Output text by line
2. Output text by field
3. Calling shell commands through pipes and double quotes
Introduction
The awk command is a programming language used on Linux/UNIX systems to process text and data. It was designed specifically for text processing, works line by line, and is commonly used for scanning, filtering, and statistical summary work.
One. Principle
awk reads the text line by line, splits each line into fields on spaces or tab characters by default, stores the fields in built-in variables, and executes the editing commands according to the given pattern or condition.
sed is usually used to process a whole line at a time, whereas awk tends to split a line into multiple "fields" and then process them.
awk also reads its input line by line, and the results can be displayed by using print to output field data.
When using awk, the logical operators "&&" (and), "||" (or), and "!" (not) are available.
Simple arithmetic is also supported: +, -, *, /, %, and ^ stand for addition, subtraction, multiplication, division, remainder, and exponentiation respectively; an example follows below.
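A quick sketch of these operators in action (nothing here depends on an existing file; the two numbers are fed in with echo):
echo "3 4" | awk '{print ($1+$2), ($1*$2), ($2%$1), ($1^2)}'
# Prints 7 12 1 9: the sum, product, remainder, and power computed from the two fields
echo "3 4" | awk '($1<5)&&($2>3){print "both conditions hold"}'
# "&&" requires both conditions to be true before the action is executed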
Two. Command format
- awk [options] 'pattern or condition {action}' file1 file2 ...
- awk -f script-file file1 file2 ... (a sketch of this form follows below)
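For the second form, the program is kept in a separate script file and passed with -f (a minimal sketch; count.awk is just an illustrative file name):
echo 'BEGIN{n=0} /bash$/{n++} END{print n}' > count.awk
awk -f count.awk /etc/passwd
# Equivalent to writing the program inline: awk 'BEGIN{n=0} /bash$/{n++} END{print n}' /etc/passwd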
Three. Common awk built-in variables
Variable    Explanation
FS          Field (column) separator. Specifies the field separator for each line of text; defaults to spaces or tab stops. Has the same effect as "-F".
NF          Number of fields in the line currently being processed.
NR          Line number (ordinal) of the line currently being processed.
$0          The entire content of the line currently being processed.
$n          The nth field (nth column) of the line currently being processed.
FILENAME    Name of the file being processed.
RS          Record (line) separator. When awk reads from a file, it cuts the data into records according to RS and reads only one record at a time for processing. Defaults to '\n'.
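Several of these variables can be combined in one action; a minimal sketch against /etc/passwd:
awk -F ":" '{print FILENAME, NR, NF, $1}' /etc/passwd
# For every line, print the file name being processed, the line number, the number of fields, and the 1st field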
Case study
1. Output text by line
awk '{print}' like.txt
# Output all content
awk '{print $0}' like.txt
# Output all content (same as above)
awk 'NR==1,NR==3{print}' like.txt
# Output lines 1 through 3
awk '(NR>=1)&&(NR<=3){print}' like.txt
# Output lines 1 through 3
awk 'NR==1||NR==3{print}' like.txt
# Output line 1 and line 3

awk '(NR%2)==1{print}' 1.txt
# Output all odd-numbered lines
awk '(NR%2)==0{print}' 1.txt
# Output all even-numbered lines
awk '/^root/{print}' /etc/passwd
# Output lines beginning with root
awk '/nologin$/{print}' /etc/passwd
# Output lines ending with nologin
awk 'BEGIN {x=0};/\/bin\/bash$/{x++};END {print x}' /etc/passwd
# Count the lines ending with /bin/bash
# Equivalent to grep -c "/bin/bash$" /etc/passwd
# The BEGIN pattern means that the actions specified in BEGIN are executed before awk processes the text
# awk then processes the text, and finally executes the actions specified in the END pattern
# The END{} block usually contains statements that print results; a stripped-down example follows
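The three stages can be seen in a minimal counter (a sketch; like.txt is the sample file used above):
awk 'BEGIN {n=0};{n++};END {print "total lines:", n}' like.txt
# The BEGIN action runs once before any input, the middle action runs once per line, and the END action runs once after the last line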
2. Output text by field
awk -F ":" '{print $3}' /etc/passwd
# Output in each line ( Separated by spaces or tab stops ) Of the 3 A field
awk -F ":" '{print $1,$3}' /etc/passwd
# Output the 1、3 A field
awk -F ":" '$3<5{print $1,$3}' /etc/passwd
# Output No 3 Field values are less than 5 Of the 1、3 Field contents

awk -F ":" '!($3<200){print}' /etc/passwd
# Output No 3 The value of each field is not less than 200 The line of
awk 'BEGIN {FS=":"};{if($3>=200){print}}' /etc/passwd
# I'll finish it first BEGIN The content of , And print out the contents of the text
awk -F ":" '{max=($3>$4)?$3:$4;{print max}}' /etc/passwd
#($3>$4)?$3:$4 Ternary operator , If the first 3 The value of field is greater than 4 Values for fields , Then put the 3 The value of a field is assigned to max, Otherwise, No 4 The value of a field is assigned to max
awk -F ":" '{print NR,$0}' /etc/passwd
# Output each line content and line number , Every time a record is processed ,NR It's worth adding 1
3. Calling shell commands through pipes and double quotes
echo $PATH | awk 'BEGIN{RS=":"};END{print NR}'
# Count the number of colon-separated path segments; the END{} block usually contains statements that print results
awk -F: '/bash$/{print | "wc -l"}' /etc/passwd
# Pipe the output to wc -l to count the users whose shell is bash; equivalent to grep -c "bash$" /etc/passwd
free -m | awk '/Mem:/ {print int($3/($3+$4)*100)}'
# Show the current memory usage as a percentage
top -b -n 1 | grep Cpu | awk -F ',' '{print $4}' | awk '{print $1}'
# Show the current CPU idle rate (-b -n 1 means only one iteration of output is needed)
date -d "$(awk -F "." '{print $1}' /proc/uptime) second ago" +"%F %H:%M:%S"
# Show the time of the last system boot (the same information uptime is based on); "second ago" converts "that many seconds ago" into a date, and +"%F %H:%M:%S" is equivalent to the +"%Y-%m-%d %H:%M:%S" time format
awk 'BEGIN {while ("w" | getline) n++ ; print n-2}'
# Call the w command and use it to count the number of logged-in users (w prints two header lines, hence n-2)
awk 'BEGIN {"hostname" | getline ; print $0}'
# Call hostname and output the current host name
seq 10 | awk '{print $0; getline}'
# Print the odd-numbered lines (1 3 5 7 9): the current line is printed, then getline consumes the next line without printing it
seq 10 | awk '{getline; print $0}'
# Print the even-numbered lines (2 4 6 8 10): getline replaces the current line with the next one before it is printed
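getline can also capture a command's output into a variable instead of replacing $0 (a minimal sketch):
awk 'BEGIN {"date" | getline d ; print "now:", d}'
# The first line of date's output is stored in the variable d, so $0 and the fields are left untouched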