Bash documentation:
man bash
man bash | wc -l   # count the lines in the man page
info bash | wc -l  # count the lines in the info page
http://goo.gl/S2Jj2n
Executing the script:
Make scripts executable with chmod u+x (or +x for all users):
chmod +x file.sh
The bash time command
Bash has a builtin time command to report how much time a process consumed
time find / -name File_name
Variables in Bash
for a shell script to get a copy of a shell variable, it needs to be "exported":
export mynewvar or declare -x mynewvar
you can export and assign in the same statement:
export var2="var2 value".
export -f myFunc will export the function myFunc.
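A quick sketch of the difference (the variable names are made up); a child bash only sees the exported variable:
exported_var="visible"; export exported_var
plain_var="hidden"
bash -c 'echo exported_var is [$exported_var] plain_var is [$plain_var]'
# prints: exported_var is [visible] plain_var is []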
a=1
(a=2)
echo $a
prints 1 (the parentheses run a subshell, so the assignment does not affect the parent shell)
a=1
{ a=2; }
echo $a
prints 2 (the brace group runs in the current shell)
See builtin commands:
$ enable
See bash keywords:
$ compgen -k
Bash startup
.bash_profile is read when bash is invoked as a login shell.
.bashrc is executed when a new interactive shell is started.
if you extend an exported variable, like PATH, in .bashrc, it will grow with each nested shell invocation.
PATH=$PATH:/usr/local/bin
This would keep adding /usr/local/bin to the end of PATH within nested shells.
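One way to avoid that growth (a sketch, not the only approach) is to append the directory only when it is not already in PATH:
case ":$PATH:" in
  *":/usr/local/bin:"*) ;;            # already there, do nothing
  *) PATH=$PATH:/usr/local/bin ;;     # append just once
esac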
Aliases and functions should normally be defined in .bashrc.
Sourcing Scripts:
source example.sh, or . example.sh
it is "dot space" as a short way to source a script.
The shell executed the script in the shell's own process instead of in a new process.
Sourcig is common way to import variable assignments or functions
The sourced script is executed within the calling shell's process.
$ source ./script_file.sh
Note: variables (and functions) defined in the sourced script remain available in the current shell afterward.
Working with Aliases
The alias command allows for short command alternatives: alias ll="ls -l"
Some people use an alias to give a command an alternative, more familiar name or to fix a common typo.
alias copy=cp
alias rn=mv
alias mroe=more
mroe myfile
ls -l | mroe
List defined aliases by simply typing alias
Unset an alias with the unalias command
Using the Echo Command
Built into bash and doesn't start a new process
-n -> don't print a trailing newline
-e -> enable backslash-escaped characters like \n and \t
-E -> disable backslash-escaped characters in case they were enabled by default
Ex:
echo Hello World
echo -n Good to see you "\n\n"
echo Thanks
echo -e Hi "\t\t\t" There "\n\n"
echo -E Bye "\t\t\t" For now "\n\n"
ls * => would list contents of directories
echo * => would show file and directory names
Use file redirection techniques to send the output to other files, such as stderr:
echo 'Warning will Robinson!' >&2
Local Variables and Typeset
Variables can be created inside a function that are not available outside of it.
The typeset command makes variables local, can provide a type, or can provide formatting.
typeset -i x
#x must be an integer
Arithmetic is faster for variables defined to be integers.
The let command allows for convenient arithmetic:
let x++; let y=x**2; let x=x*3; let x*=5, ....
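A small sketch of integer variables and let arithmetic:
typeset -i count=0     # count may only hold integers
let count++            # count is now 1
let square=count**2    # exponentiation with **
echo count is $count square is $square
# prints: count is 1 square is 1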
The Declare Command
declare -l: values assigned to the variable are converted to lowercase.
declare -u: values assigned to the variable are converted to uppercase.
declare -r variable is made read-only.
declare -a MyArray will make MyArray an indexed array.
declare -A MyArray2 will make MyArray2 an associative array.
function f1 {
typeset x
x=7
y=8
}
x=1
y=2
echo x is $x
echo y is $y
f1
echo x is $x
echo y is $y
output:
x is 1
y is 2
x is 1
y is 8
-------------------
declare -l lstring="ABCdef"
declare -u ustring="ABCdef"
declare -r readonly="A Value"
declare -a Myarray
declare -A Myarray2
echo lstring = $lstring
echo ustring = $ustring
echo readonly = $readonly
readonly="New Value"
Myarray[2]="Second Value"
echo 'Myarray[2]= ' ${Myarray[2]}
Myarray2["hotdog"]="baseball"
echo 'Myarray2[hotdog]= ' ${Myarray2["hotdog"]}
Output:
lstring = abcdef
ustring = ABCDEF
readonly = A Value
declare.sh: line 11: readonly: readonly variable
Myarray[2]= Second Value
Myarray2[hotdog]= baseball
The Read Command
Read a line into a variable or multiple variables.
read a b  # reads the first word into a and the rest into b
Convenient for use with a while loop.
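A quick sketch of the word splitting (the braces keep read and the echoes in the same subshell of the pipeline):
echo "alpha beta gamma delta" | { read a b; echo a is $a; echo b is $b; }
# a is alpha
# b is beta gamma delta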
While Loops
while
command list1
do
command list2
done
#loops while command list1 succeeds
Ex:
x=0
while ((x<10))
do
echo loop $x;
date >date.$x
((x++))
done
-------------
while
read a b
do
echo a is $a b is $b
done <data_file
-------------
ls -l | while
read a b c d
do
echo owner is $c
done
--------------
For Loops
for <var> in <list>
do
command list
done
for i in dog cat elephant
do
echo i is $i
done
----------------
seq 1 5
#prints 1 2 3 4 5, one number per line
for num in `seq 1 5`
#loops over 1 2 3 4 5
generate sequence with {A..Z}, {1..10}
for d in $(<data_file)
#loops over space/newline
#separated data in data_files
-----
for j in *.c
#making a list with file globbing
----
for f in $(find . -name '*.c')
#using a command to generate a list (quote the pattern so the shell does not expand it first)
-----------
export a=first
export b=second
export c=third
echo a is '['$a']' b is '['$b']' c is '['$c']'
read a b <data_file
echo a is '['$a']' b is '['$b']' c is '['$c']'
data_file:
dog cat rooster
elephant hen rabbit snake
carrot lettuce pineapple banana pizza
output:
a is [first] b is [second] c is [third]
a is [dog] b is [cat rooster] c is [third]
--------
ls -l /etc | while
read a b c d
do
echo owner is $c
done
----------------
nl for.sh => nl numbers the lines
1 #!/bin/bash
2 for i in dog cat hotdog
3 do
4 echo i is $i
5 done
6 for i in `seq 3 5`
7 do
8 echo i in seq is $i
9 done
10 for i in {N..P}
11 do
12 echo i in letter list is $i
13 done
14 for d in $(<data_file)
15 do
16 echo d in data_file is $d
17 done
18 for f in $(find /etc 2>/dev/null | grep grub)
19 do
20 echo grub named things are $f
21 done
Functions:
Give a name to a sequence of statements that will execute within the shell, not in a new process
function NAME {
function body........
}
commonly used to organize code in a shell program
function printhello {
echo Hello
}
printhello
#shell memorizes the function like it's a new command
Return Command:
Functions return when there are no more statements or when a return statement is executed.
function myFunc {
echo starting
return
echo this will not be executed
}
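return can also pass back a small numeric status that the caller can read in $? (a small sketch; the function name checkfile is made up):
function checkfile {
  test -f "$1" && return 0
  return 1
}
checkfile /etc/hosts
echo status is $?   # prints 0 if /etc/hosts is a regular file, 1 otherwise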
Capturing function output:
functions produce results by writing output like commands do.
hvar=$(printhello)
The Exit command:
exit <value> sets the exit status (represented by $?) to <value>
exit terminates the shell process
exit in a function terminates the whole shell program, not just the function.
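A minimal sketch (the script name exitdemo.sh is made up) showing exit and $?:
#!/bin/bash
echo about to exit
exit 3
echo this line is never reached
# running it:  bash exitdemo.sh; echo $?   prints "about to exit" and then 3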
Ex:
function myfunc {
echo starting myfunc
return
echo this will not be executed
}
myfunc
n=$(myfunc)
echo n is $n
Output:
starting myfunc
n is starting myfunc
---------------------
Redirection and pipes:
Processes normally have three files open:
0 =>stdin, 1 =>stdout, 2 =>stderr
Command > stdout-here 2> stderr-here < stdin-from-here
Command &> file
#file gets stdout and stderr from command, file is created or overwritten
Redirection and Pipes:
command | command2
#command2's stdin comes from command's stdout
command 2>&1 | command2
#command2 gets both the stdout and stderr of command
command |& command2
#alternative way for command2 to get command's stdout and stderr as its stdin
command >> file
#appends stdout of command to end of file
command &>> file
#appends stdout and stderr of command to end of file
Here Documents: <<
Here documents are a way to embed standard-input data inside a script.
They avoid having to create a new file just to hold some input values.
sort <<END
cherry
banana
apple
orange
END
Open and Close File Descriptors:
exec N< myfile
#opens file descriptor N for reading from file myfile
exec N> myfile
#opens file descriptor N for writing to myfile
exec N<> myfile
#opens file descriptor N for reading & writing with myfile
exec N>&- or exec N<&-
#closes file descriptor N
Use lsof to see what file descriptors for a process are open
exec 7>/tmp/myfile7
lsof -p $$
# $$ is shell's PID
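A sketch of reading through a descriptor opened with exec (assumes a file named myfile exists):
exec 4<myfile          # open fd 4 for reading from myfile
read -u 4 firstline    # read one line from fd 4
echo first line is $firstline
exec 4<&-              # close fd 4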
echo Just '>' ---------------------------------------
find /etc -name grub >grub.out
echo Doing '2>' ---------------------------------------
find /etc -name grub 2>errs.out
echo Doing '&>' ---------------------------------------
find /etc -name grub &>both.out
> only standard output
2> only standard error
&> both standard output and errors
find /etc -name grub |& grep grub
|& => both stdout and stderr of the left command are piped to the stdin of the right command
echo hi >myfile
replaces the contents of myfile with hi
echo cheese >>myfile
appends cheese to the end of myfile.
Sorting with a here document:
sort <<END
<content>
END
Ex:
sort <<END
cherry
banana
apple
orange
END
while
read a b c
do
echo a: $a b:$b c:$c
done <<EOF
one two three four
five six seven eight nine ten
eleven twelve
EOF
The Case Statement:
case expression in
pattern 1)
command list;;
pattern 2)
command list;;
...
esac
ex:
case $ans in
yes|YES|y|Y|y.x ) echo "will do!";;
n*|N*) echo "will not do!";;
*) echo "Oops!";;
esac
The If-Then-Else Statement
if
command list # last result is used
then
command list
[else
command list]
fi
EX:
if
grep -q important myfile
then
echo myfile has important stuff
else
echo myfile does not have important stuff
fi
Tests in Bash
The builtin test is used to check various conditions and set the return code with the result.
Loops and conditionals often use the result of test
An alternative to test is [[ ]] or (( ))
ex: Test Example
if
test -f afile
if [[ -f bfile ]] #instead of 'test' [[]]
if
test $x -gt 5
Test Operators
Numeric Comparison:
[[ ex1 -eq ex2 ]] [[ ex1 -ne ex2 ]]
[[ ex1 -lt ex2 ]] [[ ex1 -gt ex2 ]]
[[ ex1 -le ex2 ]] [[ ex1 -ge ex2 ]]
or
(( ex1 == ex2 )) (( ex1 != ex2 ))
(( ex1 < ex2 )) (( ex1 > ex2 ))
(( ex1 <= ex2 )) (( ex1 >= ex2 ))
(( ex1 && ex2 )) (( ex1 || ex2 ))
(( expr?expr:expr))
test -d X => success if X is a directory
test -f X => success if X exists and is a regular file
test -s X => success if X exists and is non-empty
test -x X => success if you have x permission on X
test -w X => success if you have w permission on X
test -r X => success if you have r permission on X
Ex:
x=01
y=1
echo comparing $x and $y
if
[ $x == $y ]
then
echo ==
else
echo not ==
fi
if
[ $x -eq $y ]
then
echo eq
else
echo not eq
fi
if
((x==y))
then
echo '(())' ==
else
echo not '(())' ==
fi
Ex: 2
if
test -x /bin/ls
then
if
[ ! -w /etc/hosts ]
then
if
echo about to look for foobar
grep -q foobar /etc/passwd
then
echo foobar found in /etc/passwd
else
echo foobar not found
fi
fi
else
echo Oh no, /bin/ls not executable
fi
Filters:
In Linux, a program is a filter if it reads from stdin and writes to stdout.
Filters can be used in pipes.
Filters provide a powerful means of combining the input and output of a sequence of commands to get the kind of report you want.
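For example, a small pipeline built from filters (names.txt is an arbitrary file name):
sort names.txt | uniq -c | sort -rn | head -3
# sort the lines, count repeated ones, order by count, show the top 3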
The Head and Tail Commands
head prints the first n lines of a file or stdin.
tail prints the last n lines of a file or stdin
ls -l |head -5 #first 5 lines of ls -l
ls -l |tail -7 #last 7 lines of ls -l
ls -l |head -10 |tail -5 #lines 6-10
The wc Command
wc (word count) prints line, word, and character counts.
wc -l prints only the number of lines.
ls | wc -l prints the number of entries in a directory.
$ ./makeoutput.sh >output &  # a trailing & runs the process in the background
$ tail -n2 -f output
#!/bin/bash
for i in {1..100}
do
read a b c d e <<END
$(date)
END
echo $d
sleep 1
done
The Command sed:
is a stream editor, which means it is not interactive
works great as a filter
is ideal for batch editing tasks
Usually applies its editing to all lines in the input
with the -i option, sed changes the file in place instead of writing the modified text to stdout
Using sed Substitute:
sed 's/old/new/' myfile
Substitute the first occurrence of old on each line for new in the file myfile and display the result on stdout
old is a pattern and can be regular expression
The / is the usual character to separate the old from the new.
the file myfile will not be changed; the new version is echoed to stdout
No options are required for simple substitutions.
Ex:
sed 's/@home/@domicile/; s/truck/lorrie/'
sed -e 's/[xX]/Y/' -e 's/b.*n/blue/'
sed -f sedscript -n sed4
date | sed 's/J/j/'
sed '1,5p'
sed '/alpha/s/beta/gamma/'
sed '/apple/,/orange/d'
sed '/important/!s/print/throw_away/'
The awk Language:
A pattern matching language
An interpreted programming language that works as a filter
Good for report writing
Handy for short algorithmic kinds of processing
processes a line at a time like sed
breaks each line into fields, $1, $2, etc.,
Fields are delimited by the value of the variable FS, normally whitespace.
$0 is the entire line (record)
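A simpler sketch before the ps example below, printing selected fields (field positions assume the usual ls -l layout):
ls -l | awk '{print $9, $5}'   # file name and size from each line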
EX:
$ps -el | \
awk '/pts/||$8~/35/{printf("%5d %5d %s\n", $4, $5, $14)}'
Ex:
sed2:
s/a/A/
s/B/BBB/
$ cat sometext:
now we have
some words and
fruit like apple cherry orange peach
and BIG things like
cruise ship, skyscraper, Bigfoot
$sed -f sed2 sometext
now we hAve
some words And
fruit like Apple cherry orange peach
And BBBIG things like
cruise ship, skyscrAper, BBBigfoot
Script Parameters and {}:
Parameters to a shell program: $1, $2
called "positional parameters"
to reference multi-digit parameters, use {}, e.g., ${10}
$0 is the path to the program itself:
For example, echo Usage: $0 arg1 .....
shift moves $2 into $1, $3 into $2, etc.
It is sometimes handy or required to use {} with a named variable, e.g., echo ${abc}DEF.
x=abc
abc=def
echo ${!x} prints def. indirection!
Unset or Null Variables:
${variable <OPR> value}
x=${var:-Hotdog}
:- if var is unset/null, return value; otherwise, return the value of var
:= if var is unset/null, var is assigned value and that value is returned
:? display an error and exit the script if var is unset/null
:+ if var is unset/null, return nothing; otherwise, return value
String Operations:
${var:offset} - value of var starting at offset
${var:offset:len} - value of var starting at offset, up to length len
${#var} - length of var
${var#pre} - remove matching prefix
${var%post} - remove matching suffix
Prefix and suffix removal are handy for processing filenames/paths
ex:
#!/bin/bash
echo arg1 is $1 arg 11 is ${11}
shift
echo now arg1 is $1 arg 11 is ${11}
echo program is $0
$bash pos.sh {A..Z}
arg1 is A arg 11 is K
now arg1 is B arg 11 is L
program is pos.sh
ex:
x=abc
abc="Start Of Alphabet"
echo x is $x
echo abc is $abc
echo '${!x}' is ${!x}
output
x is abc
abc is Start Of Alphabet
${!x} is Start Of Alphabet
Ex:
#!/bin/bash
unset x
a=${x:-Hotdog}
echo a is $a
echo x is $x
a=${x:=Hotdog}
echo a is $a
echo x is $x
unset x
${x:?}
echo Will not get here
output:
a is Hotdog
x is
a is Hotdog
x is Hotdog
./unsetnull.sh: line 13: x: parameter null or not set
Ex:
#!/bin/bash
s="a string with words"
sub=${s:4}
echo sub is $sub
sub=${s:4:3}
echo sub is $sub
echo length of s is ${#s}
output:
sub is ring with words
sub is rin
length of s is 19
Ex:
p="/usr/local/bin/hotdog.sh"
echo whole path is $p
echo Remove prefix ${p#/*local/}
echo Remove suffix ${p%.sh}
cmd=${p#*/bin/}
cmd2=${cmd%.sh}
echo the command without .sh is $cmd2
output:
whole path is /usr/local/bin/hotdog.sh
Remove prefix bin/hotdog.sh
Remove suffix /usr/local/bin/hotdog
the command without .sh is hotdog
Advanced Bash:
Using Coprocesses:
A coprocess is a background process where your shell gets file descriptors for the process's stdin and stdout. It is implemented with pipes.
We need a script that is a filter.
#!/bin/bash
while
read line
do
echo $line | tr "ABC" "abc"
done
Using Coprocesses:
coproc ./mycoproc.sh
echo BANANA >&"${COPROC[1]}"
cat <&"${COPROC[0]}"
or
coproc my { ./mycoproc.sh; }
echo BANANA >&"${my[1]}"
cat <&"${my[0]}"
Debugging scripts:
tee copies its stdin to stdout and also writes it to a file
ex: cmd | tee log.file | ...
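A sketch of tee capturing intermediate pipeline stages (the log file names are arbitrary):
find /etc -name '*.conf' 2>/dev/null | tee all_conf.log | grep net | tee net_conf.log | wc -l
# all_conf.log keeps the full list, net_conf.log keeps the filtered list, and the count still reaches stdout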