Linux awk command (syntax and operators)
awk command overview:
awk is a tool for processing text files (a pattern scanning and processing language).
$ awk --help
Usage: awk [POSIX or GNU style options] -f progfile [--] file ...
Usage: awk [POSIX or GNU style options] [--] 'program' file ...
POSIX options: GNU long options: (standard)
-f progfile --file=progfile
-F fs --field-separator=fs
-v var=val --assign=var=val
Short options: GNU long options: (extensions)
-b --characters-as-bytes
-c --traditional
-C --copyright
-d[file] --dump-variables[=file]
-e 'program-text' --source='program-text'
-E file --exec=file
-g --gen-pot
-h --help
-L [fatal] --lint[=fatal]
-n --non-decimal-data
-N --use-lc-numeric
-O --optimize
-p[file] --profile[=file]
-P --posix
-r --re-interval
-S --sandbox
-t --lint-old
-V --version
gawk is a pattern scanning and processing language.
By default it reads standard input and writes standard output.
Examples:
gawk '{ sum += $1 }; END { print sum }' file
gawk -F: '{ print $1 }' /etc/passwd
Examples:
(base) [biocodee@localhost ~]$ head log.txt
2 this is a test
3 Are you like awk
This's a test
10 There are orange,apple,mongo
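To follow along, the test file used in these examples can be recreated from the head output above:

$ cat > log.txt <<'EOF'
2 this is a test
3 Are you like awk
This's a test
10 There are orange,apple,mongo
EOF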
(base) [biocodee@localhost ~]$ awk '{print $1,$4}' log.txt
2 a
3 like
This's
10 orange,apple,mongo
(base) [biocodee@localhost ~]$ awk '{printf "%-8s %-10s\n",$1,$4}' log.txt
2        a
3        like
This's
10       orange,apple,mongo
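printf uses C-style format strings: %-8s left-justifies a string in an 8-character-wide field, while %8s right-justifies it. A quick sketch:

$ awk 'BEGIN{printf "[%-8s][%8s]\n", "abc", "abc"}'
[abc     ][     abc]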
# Specify the field separator with -F
(base) [biocodee@localhost ~]$ awk -F, '{print $1,$4}' log.txt
2 this is a test
3 Are you like awk
This's a test
10 There are orange
(base) [biocodee@localhost ~]$ awk 'BEGIN{FS=","} {print $1,$2}' log.txt
2 this is a test
3 Are you like awk
This's a test
10 There are orange apple
# Multiple separators: the character class [ ,] splits fields on either a space or a comma
(base) [biocodee@localhost ~]$ awk -F '[ ,]' '{print $1,$2,$5}' log.txt
2 this test
3 Are awk
This's a
10 There apple
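Setting FS to the same regular expression in a BEGIN block is equivalent to -F '[ ,]' and prints the same output as above:

$ awk 'BEGIN{FS="[ ,]"} {print $1,$2,$5}' log.txt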
(base) [biocodee@localhost ~]$ awk -va=1 '{print $1,$1+a}' log.txt
2 3
3 4
This's 1
10 11
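# Juxtaposition concatenates strings: $1b below means field $1 followed by the value of variable b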
(base) [biocodee@localhost ~]$ awk -va=1 -vb=s '{print $1,$1+a,$1b}' log.txt
2 3 2s
3 4 3s
This's 1 This'ss
10 11 10s
# Run an awk program stored in a script file
$ awk -f ***.awk log.txt
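As an illustrative sketch only (the file name and contents below are not from the original), such a script file, say sum.awk, could contain:

# sum.awk -- hypothetical example: sum the first column of the input
{ sum += $1 }               # runs once for every input line
END { print "sum:", sum }   # runs after the last line

and is invoked as:

$ awk -f sum.awk log.txt    # for log.txt above this prints "sum: 15" (non-numeric "This's" counts as 0)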
# Filter: keep lines whose first column is greater than 2
awk '$1>2' log.txt
(base) [biocodee@localhost ~]$ awk '$1>2' log.txt
3 Are you like awk
This's a test
10 There are orange,apple,mongo
# Note: a single '=' is assignment, not comparison (use '==' to compare);
# here $1 is set to 2 on every line, so every rebuilt line is printed
awk '$1=2' log.txt
(base) [biocodee@localhost ~]$ awk '$1=2' log.txt
2 this is a test
2 Are you like awk
2 a test
2 There are orange,apple,mongo
awk '$1>2 && $2=="Are" {print $1,$2,$3}' log.txt
(base) [biocodee@localhost ~]$ awk '$1>2 && $2=="Are" {print $1,$2,$3}' log.txt
3 Are you
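# ~ tests a field against a regular expression; here it selects lines whose second field contains "th"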
(base) [biocodee@localhost ~]$ awk '$2 ~ /th/ {print $2,$4}' log.txt
this a
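# IGNORECASE is a gawk extension; setting it to 1 makes pattern matching case-insensitive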
(base) [biocodee@localhost ~]$ awk 'BEGIN{IGNORECASE=1} /this/' log.txt
2 this is a test
This's a test
# Negated patterns
#awk '$2 !~ /th/ {print $2,$4}' log.txt
(base) [biocodee@localhost ~]$ awk '$2 !~ /th/ {print $2,$4}' log.txt
Are like
a
There orange,apple,mongo
#awk '!/th/ {print $2,$4}' log.txt
(base) [biocodee@localhost ~]$ awk '!/th/ {print $2,$4}' log.txt
Are like
a
There orange,apple,mongo
# `awk` scripts
Keywords: BEGIN and END
Statements run before any input is read ----> BEGIN{***}
Statements run after all lines have been processed ----> END{}
Statements run for every input line ----> {}
(base) [biocodee@localhost blog]$ awk -f score_cal.awk score.txt
NAME     NO.     MATH    ENGLISH  COMPUTER  TOTAL
---------------------------------------------
Marry    2143    78      84       77        239
Jack     2321    66      78       45        189
Tom      2122    48      77       71        196
Mike     2537    87      97       95        279
Bob      2415    40      57       62        159
---------------------------------------------
TOTAL:           319     393      350
AVERAGE:         63.80   78.60    70.00
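The score_cal.awk script itself is not shown; the following is only a reconstruction sketch, assuming score.txt holds five whitespace-separated columns (name, student number, math, english and computer scores), that produces equivalent output:

# score_cal.awk -- reconstruction sketch, not the original script
BEGIN {
    printf "%-9s%-8s%-8s%-9s%-10s%s\n", "NAME", "NO.", "MATH", "ENGLISH", "COMPUTER", "TOTAL"
    print "---------------------------------------------"
}
{
    math     += $3                                      # accumulate column sums for the summary rows
    english  += $4
    computer += $5
    printf "%-9s%-8s%-8d%-9d%-10d%d\n", $1, $2, $3, $4, $5, $3 + $4 + $5
}
END {
    print "---------------------------------------------"
    printf "%-9s%-8s%-8d%-9d%-10d\n", "TOTAL:", "", math, english, computer
    printf "%-9s%-8s%-8.2f%-9.2f%-10.2f\n", "AVERAGE:", "", math / NR, english / NR, computer / NR
}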
##################################################################
# 9x9 multiplication table
(base) [biocodee@localhost blog]$ seq 9 | sed 'H;g' | awk -v RS='' '{for(i=1;i<=NF;i++)printf("%dx%d=%d%s",i,NR,i*NR,i==NR?"\n":"\t")}'
1x1=1
1x2=2 2x2=4
1x3=3 2x3=6 3x3=9
1x4=4 2x4=8 3x4=12 4x4=16
1x5=5 2x5=10 3x5=15 4x5=20 5x5=25
1x6=6 2x6=12 3x6=18 4x6=24 5x6=30 6x6=36
1x7=7 2x7=14 3x7=21 4x7=28 5x7=35 6x7=42 7x7=49
1x8=8 2x8=16 3x8=24 4x8=32 5x8=40 6x8=48 7x8=56 8x8=64
1x9=9 2x9=18 3x9=27 4x9=36 5x9=45 6x9=54 7x9=63 8x9=72 9x9=81
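Here sed 'H;g' re-prints the lines seen so far as growing blank-line-separated blocks, and RS='' makes awk treat each block as one record, so NF equals NR for every record. The same table can be produced with plain nested loops in awk alone, which may be easier to read:

$ awk 'BEGIN{for(j=1;j<=9;j++)for(i=1;i<=j;i++)printf("%dx%d=%d%s",i,j,i*j,i==j?"\n":"\t")}'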
awk operators

| Operator | Description |
| --- | --- |
| = += -= *= /= %= ^= **= | Assignment |
| ?: | C-style conditional expression |
| \|\| | Logical OR |
| && | Logical AND |
| ~ and !~ | Matches / does not match a regular expression |
| (space) | String concatenation |
| + - ! | Unary plus, unary minus, logical negation |
| ^ ** | Exponentiation |
| ++ -- | Increment and decrement, as prefix or postfix |
| $ | Field reference |
| in | Array membership |
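A quick sketch exercising a few of these operators (conditional, concatenation, regex match and array membership):

$ awk 'BEGIN {
    x = 2; y = 3
    print (x > y ? "x" : "y"), "is larger"   # ?: conditional expression
    print "foo" "bar"                        # a space between strings concatenates them
    print ("hello" ~ /ell/)                  # regex match: prints 1 (true)
    arr["k"] = 1
    if ("k" in arr) print "k is in arr"      # array membership test
}'

This prints "y is larger", "foobar", "1" and "k is in arr".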
Reference: runoob (菜鸟教程) awk tutorial