
Chapter 8 - The Shell

The agent that sits between the user and the Unix/Linux system is the shell. Whenever you log in, this program is running in the background, even though you may see nothing at your prompt, and it doesn't terminate until you log out. If you run the ps command, you will see that a process representing the shell is already running on your system.
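You can see this for yourself. A minimal sketch (assuming a POSIX-style ps): the special parameter $$ expands to the process ID of the current shell, so ps can be asked to report exactly that process.

```shell
# $$ holds the PID of the shell you are typing into
echo $$

# ask ps to report just that process; the CMD column
# names the shell program (bash, sh, ksh, ...)
ps -p $$
```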


The shell is a unique and multi-faceted program. It is a command interpreter and a programming language rolled into one. It is also a process that creates an environment for you to work in. All the wonderful things that we do in Unix/Linux using commands are done by the shell, like interpreting the special symbols in a command line. We will see some more details and interesting features of the shell in this chapter.

What Does the Shell Actually Do? -

Here are the key points that summarise what the shell does for us:

  • The shell issues the prompt and waits for us to enter a command.
  • After a command is entered, the shell scans it for metacharacters (like the * used with rm and ls, or the > used with cat).
  • The command is then passed to the kernel for execution.
  • While execution is in progress, the shell waits for its completion.
  • Once execution is complete, the prompt returns and the shell is ready for more work.

Shell Types - 

We have different types of shells; which one to use depends on the user:

  • The Bourne shell (/bin/sh)
  • The Korn shell (/bin/ksh) and Bash (/bin/bash)
  • The C shell (/bin/csh) and its derivative, Tcsh (/bin/tcsh)

To know your shell, type the command below; the output displays the shell type as an absolute pathname.

echo $SHELL

Pattern Matching Wild Cards -  

In this topic we will see the different wild cards, or pattern-matching characters, that the shell uses to match filenames. In our previous discussion we have already met *, one of these wild cards. Let's take an example where we want a listing of the files beginning with the characters sam in a directory. One way is to give the list of filenames explicitly to the ls command:

ls sample.txt sample1.txt sample2.txt

Rather than typing every filename, we use the metacharacter *. The shell first expands sam* to all the filenames beginning with sam and then passes these names to ls, which displays their details on the terminal. These metacharacters belong to a special category called wild cards.

ls sam*

We will discuss the significance of the metacharacters in the wild-card set listed in the table below.

Wild Card     Matches
*             Any number of characters, including none
?             A single character
[ijk]         A single character - either an i, j or k
[x-z]         A single character within the ASCII range of x and z
[!ijk]        A single character that is not an i, j or k
[!x-z]        A single character not within the ASCII range of x and z
{pat1,pat2}   pat1, pat2, etc.
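A quick way to internalise the table is to try each pattern against a handful of files. This sketch uses illustrative filenames in a scratch directory created with mktemp -d:

```shell
# work in a fresh scratch directory so the demo files don't pollute anything
cd "$(mktemp -d)"
touch sam sam1 sam12 samx note.c note.java

ls sam*           # sam sam1 sam12 samx - any run of characters, including none
ls sam?           # sam1 samx           - exactly one character after sam
ls sam[1x]        # sam1 samx           - one character: either 1 or x
ls sam[!1x]       # error: nothing here matches - one character that is neither 1 nor x
ls note.{c,java}  # note.c note.java    - brace list (not available in the Bourne shell)
```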

The * and ?  

We have just run ls sam*, which lists the files beginning with sam. The metacharacter * is one of the characters of the shell's wild-card set. It matches any number of characters, including none. What happens when you use echo with only a * as its argument?

echo *

You simply see a list of files! All filenames in the current directory match a solitary *, so you will see all of them in the output. If you use rm * in this directory, all these files will be deleted.

* may occur anywhere in a filename, not merely at the end. Thus *sam* matches all of the following filenames: sample, newsample, testsample.

Now let's see another wild card, ?, which matches a single character. If we use the same string with it, sam?, the shell matches all four-character filenames beginning with sam. Appending another ? creates the pattern sam??, which matches five-character filenames. Try both expressions; you will get different results.

  ls sam?
  sam1 sam2 sam3     --- output
  ls sam??
  sam01 sam02        --- output

Both the * and ? operate with some restrictions that are taken up in the next topic.

Matching The Dot -  

The behavior of * and ? in relation to the dot isn't as straightforward as it may seem. The * doesn't match filenames beginning with a . (dot), nor the / of a pathname. If you want to list all hidden filenames in your directory that have a . at the beginning and at least two characters after it, the dot must be matched explicitly:

ls .??*

However, if the filename contains a dot anywhere other than at the beginning, it need not be matched explicitly. For example, sam*txt matches a dot embedded in the filename:

ls sam*txt

There are two things that * and ? can't match. First, they don't match a filename beginning with a dot, though they can match any number of embedded dots; for instance, sam*txt matches sam.txt and sam1.txt. Second, these characters don't match the / in a pathname. You can't use cd /mangep?shell_progs to switch to /mangep/shell_progs.
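The dot rule is easy to verify. A minimal sketch, again with illustrative filenames in a scratch directory:

```shell
cd "$(mktemp -d)"
touch .profile .x sam.txt sam1.txt

ls *          # sam.txt sam1.txt - a leading dot is never matched by *
ls .??*       # .profile         - the dot matched explicitly; .x is too short
ls sam*txt    # sam.txt sam1.txt - an embedded dot needs no special treatment
```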

The Character Class -  

The patterns that we have seen so far are not very restrictive. You can frame more restrictive patterns with the character class. The character class comprises a set of characters enclosed in rectangular brackets [ ], but it matches a single character in the class. The pattern [abcd] is a character class, and it matches a single character - an a, b, c or d. This can be combined with any string or another wild-card expression, so we can select sam01, sam02 and sam03 like this:

ls sam0[123]

We can even specify a range inside the character class with a - (hyphen); the two characters on either side of it delimit the range to be matched.

  ls sam[1-3] ----------------------- lists sam1, sam2, sam3
  ls sam[a-c] ----------------------- lists sama, samb, samc

A valid range specification requires that the character on the left have a lower ASCII value than the one on the right.

The expression [a-zA-Z]* matches all filenames beginning with a letter, irrespective of case.
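The range notation can be sketched like this (filenames are illustrative; note that in some non-C locales a range like [a-z] may behave unexpectedly, so the C/POSIX locale is assumed here):

```shell
cd "$(mktemp -d)"
touch sam1 sam2 sam3 sam9 sama samb

ls sam[1-3]    # sam1 sam2 sam3 - the range runs from 1 to 3, so sam9 is excluded
ls sam[a-b]    # sama samb
ls [a-zA-Z]*   # every file here - all these names begin with a letter
```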

Negating The Character Class (!) -  

We can even negate (reverse) the matching done by the character class. This feature doesn't work in the C shell, but with the other shells we can use ! as the first character in the class to negate the class. Here are two examples that should make the idea clear:

  *.[!ma]    -------- Matches all filenames with a single-character extension, except .m and .a files
  [!a-zA-Z]* ------- Matches all filenames that don't begin with a letter

Some Short Tricks - 

How can one copy all the C and Java source programs from another directory? We can do this in one command: delimit the patterns with a comma, and then put curly braces { } around them (no spaces, please!). The following command copies all the C and Java files from $HOME/prog_sources to the current directory:

cp $HOME/prog_sources/*.{c,java} .

This works in the Korn shell, the C shell and Bash, but won't work in the Bourne shell. Using curly braces, you can also access multiple directories:

cp /home/mangep/{project,html,scripts}/* .

This copies all the files from three directories (project, html and scripts) to the current directory.
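Here is a self-contained sketch of the same multi-directory copy; the directory names src1, src2 and dest are made up for the demonstration, and a brace-expanding shell (Bash, Korn or C shell) is assumed:

```shell
# set up two illustrative source directories with one file each
base="$(mktemp -d)"
mkdir "$base/src1" "$base/src2" "$base/dest"
touch "$base/src1/a.c" "$base/src2/b.java"

cd "$base/dest"
cp "$base"/{src1,src2}/* .   # one cp pulls files from both directories
ls                           # a.c b.java
```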

Escaping and Quoting -

You may be thinking that if the shell uses special characters to match filenames, then filenames themselves should never contain those characters; but in the real world that's not the case. In our earlier discussion of files we saw how easy it is to create a file named sam*, and if we run the command below, all the filenames beginning with sam will be listed along with sam*:

  ls sam*
  sam sam* sam01 sam02 sample ------- output of the above command

A file named sam* is a great nuisance and should be removed immediately, but that won't be easy. Trying rm sam* is dangerous, as it would remove all the files beginning with sam. We must be able to protect all special characters (including wild cards) so that the shell is not able to interpret them. The shell provides two solutions:

  • Escaping - Providing a \ (backslash) before the wild card to remove (escape) its special meaning.
  • Quoting - Enclosing the wild card, or even the entire pattern, in quotes (like 'chap*'). Anything within these quotes (with a few exceptions) is left alone by the shell and not interpreted.

Escaping -

Placing a \ immediately before a metacharacter turns off its special meaning. For instance, in the pattern \*, the \ tells the shell that the asterisk has to be matched literally instead of being interpreted as a metacharacter. This allows us to remove sam* without removing the other files:

rm sam\* -------------------------This will not remove sam01, sam02

The \ suppresses the meaning of the metacharacter *. This feature is known as escaping. Consider another example; run the following command, which creates a file:

echo > chap0[1-3] -------------------------creates a file chap0[1-3]

To list this file, we need to run:

ls chap0\[1-3\]

Escaping the space - Apart from metacharacters, we can even escape other characters like the space. So to remove the file Mangesh Docs.txt, run the following command:

rm Mangesh\ Docs.txt              ------------- without the \ rm would see two files

Escaping the \ itself - Sometimes you need to escape the \ itself. To escape it, you need another \ before it:

  echo \\             ------------- prints a single \
  echo The newline character is \\n
  The newline character is \n       ------------- output
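The escaping rules above can be exercised end to end. A minimal sketch, using illustrative filenames in a scratch directory:

```shell
cd "$(mktemp -d)"
touch 'sam*' sam1 sam2 'Mangesh Docs.txt'

rm sam\*               # removes only the file literally named sam*
ls                     # Mangesh Docs.txt sam1 sam2
rm Mangesh\ Docs.txt   # the escaped space keeps the name as a single argument
echo \\                # prints a single \
```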

Quoting -

There is another way to turn off the meaning of metacharacters. When command arguments are enclosed in quotes, the meaning of all special characters is turned off. Here are a few commands that use ' ' (single quotes) and " " (double quotes) to turn off the meaning of metacharacters:

  echo '\'  -------------- Displays a \
  rm 'chap*'  ------------- Removes the file chap*
  rm "My mangesh.txt" ----------- Removes the file My mangesh.txt

Escaping is sometimes tedious when we have too many characters to protect; quoting is often a better option. The following example shows the protection of four special characters using single quotes:

echo 'The characters |, <, > and $ are special characters'

We could have used escaping here, but then we would need to provide a \ in front of each of the four special characters. We used single quotes because they protect all special characters (except the single quote itself). Double quotes are more permissive: they don't protect the $ and the ` (backquote). The following should make the difference between single and double quotes clear:

echo "Command substitution uses ` ` while TERM is evaluated using $TERM"
Command substitution uses   while TERM is evaluated using axz100

The pair of backquotes is replaced by a null command, and $TERM is replaced by axz100; i.e., both ` ` and $ are interpreted inside double quotes. Now try the same example with single quotes:

echo 'Command substitution uses ` ` while TERM is evaluated using $TERM'
Command substitution uses ` ` while TERM is evaluated using $TERM

It's often crucial to select the right type of quote. Keep in mind that single quotes protect all special characters, while double quotes interpret a pair of backquotes (` `) as command substitution and $ as a variable prefix. There is also a relationship between the two: double quotes protect single quotes, and single quotes protect double quotes.
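The contrast can be sketched in three lines; the variable HOME is a standard one, so the expansion in double quotes is visible immediately:

```shell
echo '* and $HOME are literal here'     # single quotes protect everything
echo "but $HOME is expanded here"       # double quotes let $ through
echo "and backquotes run: `echo hello`" # ...and command substitution too
```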

Redirection : The Three Standard Files -

Before we discuss redirection, let's understand what terminal means, since we have often used this word in our discussions. In the context of redirection, the terminal is a generic name that represents the screen, display or keyboard. Just as we refer to a directory as a file, we will also sometimes refer to the keyboard as a terminal.

We see command output and error messages on the terminal (display), and we sometimes provide command input through the terminal (keyboard). The shell associates three files with the terminal - two for the display and one for the keyboard. Even though our terminal is also represented by a specific device name (/dev/tty), commands don't usually read from or write to this file. They perform all terminal-related activity with the three files that the shell makes available to every command:

  • Standard input - The file (or stream) representing input, which is connected to the keyboard.
  • Standard output - The file (or stream) representing output, which is connected to the display.
  • Standard error - The file (or stream) representing error messages that emanate from the command or shell. This is also connected to the display.

Every command that uses streams will always find these files open and available; the files are closed when the command completes execution. The moment the shell sees certain special characters in the command line, it reassigns these streams away from their default devices. You, as the user, instruct the shell to do that by using symbols like > and < in the command line.

Standard Input -

We have used the cat and wc commands to read disk files. These commands have an additional method of taking input: when they are used without arguments, they read the file representing the standard input. This file is indeed special; it can represent three input sources:

  • The keyboard , the default source .
  • A file using redirection with the < symbol ( a metacharacter ) .
  • Another program using a pipeline .

When you use wc without an argument and without a special symbol like < or | in the command line, wc obtains its input from the default source. You have to provide this input from the keyboard and mark the end of input with [Ctrl-d]:

   $ wc 
   standard input can be redirected 
   It can come from a file 
   or a pipeline 
     3    14    71

We have used wc before, but then it showed the filename in the fourth column. This time no filename was specified, so none was output. The shell's manipulative nature finds a place here: it can reassign the standard input to a disk file, i.e., it can redirect the standard input to originate from a file on disk. This reassignment, or redirection, requires the < symbol:

   $ wc < test.txt        File containing above 3 lines 
          3   14   71

The filename is missing once again, which means that wc didn't open test.txt itself; it read the standard input stream after the shell reassigned this stream to the disk file. Now the question arises: why bother to redirect the standard input from a file if the command can read the file itself? The answer is that there are times when you need to keep a command ignorant of the source of its input.

How Input Redirection works :

command : wc < test.txt

  • On seeing the <, the shell opens the disk file, test.txt, for reading.
  • It unplugs the standard input file from its default source and assigns it to test.txt.
  • wc reads from standard input, which has earlier been reassigned by the shell to test.txt.

The important thing here is that wc has no idea where the stream came from; it is not even aware that the shell had to open the file test.txt on its behalf!
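The two invocations above can be compared side by side. A small sketch (the file is created with mktemp, so the name is arbitrary):

```shell
tmp="$(mktemp)"
printf 'standard input can be redirected\nIt can come from a file\nor a pipeline\n' > "$tmp"

wc -l "$tmp"     # wc opens the file itself: the count is followed by the filename
wc -l < "$tmp"   # the shell opens it: the count appears alone, no filename
```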

Standard Output -

All commands that display output on the terminal actually write to the standard output file as a stream of characters, not directly to the terminal as such. There are three possible destinations for this stream:

  • The terminal, the default destination.
  • A file, using the redirection symbols > and >>.
  • Another program, via a pipeline.

The shell effects redirection of this stream when it sees the > or >> symbols in the command line. You can replace the default destination (the terminal) with any file by using the > (right chevron) operator, followed by a filename:

   $ wc test.txt > newfile 
   $ cat newfile 
          3   14   71   test.txt 

The first command sends the word count of test.txt to newfile; nothing appears on the terminal screen. If the output file doesn't exist, the shell creates it before executing the command. If it exists, the shell overwrites it, so use this operator with caution. The shell also provides the >> symbol (the right chevron used twice) to append to a file:

   wc test.txt >> newfile                        Doesn't disturb existing contents 
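The create-then-append behaviour can be sketched in a few lines (the output file is a throwaway created with mktemp):

```shell
out="$(mktemp)"
echo first  > "$out"    # > creates or truncates the file
echo second >> "$out"   # >> appends without disturbing existing contents
cat "$out"              # first
                        # second
```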

How Output Redirection works :

command : wc test.txt > newfile

  • On seeing the >, the shell opens the disk file, newfile, for writing.
  • It unplugs the standard output file from its default destination and assigns it to newfile.
  • wc opens the file test.txt for reading.
  • wc writes to standard output, which has earlier been reassigned by the shell to newfile.

The standard output of one command can also be used by another command as its standard input. This is the third destination of standard output and is taken up in the discussion of pipes.

Standard Error -

When you enter an incorrect command or try to open a nonexistent file, certain diagnostic messages show up on the screen. This is the standard error stream, whose default destination is the terminal. Trying to "cat" a nonexistent file produces the error stream:

   $ cat samplex.txt 
   cat: cannot open samplex.txt 

cat fails to open the file and writes to standard error. Using the redirection symbol for standard output obviously won't do:

   $ cat samplex.txt > errorfile 
   cat: cannot open samplex.txt            Error stream can't be captured with > 

The diagnostic output has not been sent to errorfile. It's obvious that standard error can't be redirected in the same way standard output can (with > or >>). Even though standard output and standard error use the terminal as their default destination, the shell provides a mechanism for capturing them individually. Redirecting standard error requires the 2> symbol:

   $ cat samplex.txt 2>errorfile 
   $ cat errorfile  
   cat: cannot open samplex.txt            

This works. You can also append diagnostic output, in a manner similar to the one in which you append standard output:

   cat samplex.txt 2>> errorfile 
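The same experiment can be run anywhere; missing.txt below is deliberately a file that doesn't exist in the fresh scratch directory:

```shell
cd "$(mktemp -d)"
cat missing.txt 2> err.log    # the diagnostic goes to the file, not the screen
cat err.log                   # ...and can be read back later
cat missing.txt 2>> err.log   # 2>> appends further diagnostics
```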


/dev/null AND /dev/tty : Two Special Files

/dev/null - Quite often, and especially in shell programming, you will want to check whether a program runs successfully without seeing its output on the screen. You may not want to save this output in a file either. There is a special file that simply accepts any stream without growing in size - the file /dev/null:

   $ cat foo1 foo2 >/dev/null
   $ cat /dev/null 
   $ _                            size is always zero 

Check the file size; it's always zero. /dev/null simply incinerates all output written to it. Whether you redirect or append output to this file, its size always remains zero. This facility is useful for redirecting error messages away from the terminal so they don't appear on the screen. /dev/null is actually a pseudo-device because, unlike all other device files, it's not associated with any physical device.
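A short sketch of the incinerator at work; the error-silencing idiom on the last line is a common pattern, shown here with a deliberately missing file in a scratch directory:

```shell
cd "$(mktemp -d)"
echo 'this line simply disappears' > /dev/null
ls -l /dev/null                 # the size stays 0 no matter what is written
cat missing.txt 2> /dev/null    # common idiom: throw the error message away
```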

/dev/tty - The second special file in the UNIX system is the one indicating one's terminal - /dev/tty. But make no mistake: this is not the file that represents standard output or standard error. Commands generally don't write to this file, but you will sometimes need to redirect statements in shell scripts to it.

Consider, for instance, that mangesh is working on the terminal /dev/pts/1 and sambit on /dev/pts/2. Both mangesh and sambit can refer to their own terminals with the same filename - /dev/tty. Thus, if mangesh issues the command

   who > /dev/tty 

the list of users is sent to the terminal he is currently using - /dev/pts/1. Similarly, sambit can use an identical command to see the output on their terminal, /dev/pts/2. Like /dev/null, /dev/tty can be accessed independently by several users without conflict.

You may ask why one should need to explicitly redirect output to one's own terminal, since the default output goes to the terminal anyway. The answer is that sometimes you need to specify it explicitly, as the following real-world example suggests.

Consider redirecting a shell script to a file, say by using foo.sh > redirect.txt. Redirecting a script implies redirecting the standard output of all statements in the script. That's not always desirable. Your script may contain some echo commands that provide helpful messages for the user, and you would obviously like to see them on the terminal. If these statements are explicitly redirected to /dev/tty inside the script, redirecting the script won't affect them. We will use this feature later in our shell scripts.

Pipes -

Is there any way by which we can combine two commands so that the output of one becomes the input of the other? The answer is yes. In our previous discussion (Chapter 2) we saw the symbol |, which can connect two commands. Let's learn in detail how it works and how helpful this feature is.

We know the who command, which displays the list of users, one user per line. Let's redirect the output of who into a file, user.txt:

who > user.txt

Now, to count the lines of the file's contents, we run the wc command with the -l option, giving user.txt as input to wc:

wc -l < user.txt

We have successfully counted the users who are logged in to the system using an intermediate file. This method took two steps: it needed an intermediate file that we then had to feed to wc to get the count. Running the two commands separately has two disadvantages:

  • For long-running commands this process can be slow; the second command can't start until the first has completed its work.
  • You need an intermediate file that has to be removed after the task is complete. When handling large jobs, temporary files can build up easily and consume disk space in no time.

Here is where our special symbol | comes in. The shell connects the who and wc commands using the | operator, called a pipe, which avoids the creation of the disk file. We can use who and wc in tandem so that one takes its input from the other:

who | wc -l               No intermediate Files created

Here the output of who has been passed directly as input to the wc command. It's the shell that sets up this connection, and the commands have no knowledge of it.

We can even redirect the result of the above command to a file:

who | wc -l > count.txt      

In a pipeline, all programs run simultaneously. A pipe also has a built-in mechanism to control the flow of the stream. Since a pipe is being both read and written, the reader and writer have to act in unison; if one operates faster than the other, the appropriate driver has to readjust the flow. This happens when you run the following command:

ls | more    

Since the standard output of more freezes as long as you don't scroll forward, the kernel makes sure that ls writes to the pipe only as much as more can absorb at a time.
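Any command that writes to standard output can feed a pipe, not just who and ls. A sketch (printf stands in for who here so the line count is predictable):

```shell
who | wc -l                    # count of logged-in users, no temporary file
printf 'u1\nu2\nu3\n' | wc -l  # any command's output can feed the next command
```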

tee : Creating a Tee -

tee is an external command and not a feature of the shell. It handles a character stream by duplicating its input: it saves one copy in a file and writes the other to standard output. Being a filter, tee can be placed anywhere in a pipeline. tee doesn't perform any filtering action on its input; it gives out exactly what it takes in.

The following command sequence uses tee to display the output of who and save that output in a file as well:

who | tee user.txt   
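Because tee passes the stream on, it can sit in the middle of a pipeline. A sketch (printf supplies a predictable stand-in for who's output; user.txt is an illustrative filename):

```shell
cd "$(mktemp -d)"
printf 'mangesh\nsambit\n' | tee user.txt | wc -l   # the count still appears...
cat user.txt                                        # ...and a copy was saved too
```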

Command Substitution -

Consider a simple example. Suppose you want to print a statement incorporating the output of the date command, like this:

Today's date is Thu Dec 25 08:05:17 IST 2014

The last part of the statement is the output of the date command. How do we incorporate the result of date into an echo statement? The answer is command substitution. Use the expression `date` as an argument to echo:

  echo Today's date is `date`
  Today's date is Thu Dec 25 08:05:17 IST 2014

The ` (backquote or backtick) is another special character that the shell looks for; it performs command substitution for us. The shell executes the command enclosed in backquotes and replaces it, in the command line, with the command's output. For command substitution to work, the "backquoted" command must write to standard output; that's why the date substitution worked.

You can even combine two commands using the pipeline operator and print the result with echo:

echo "There are `ls | wc -l` files in the current directory."

The command worked properly even though we enclosed the statement in double quotes. Now let's see what we get when we run it with single quotes:

  echo 'There are `ls | wc -l` files in the current directory.'
  There are `ls | wc -l` files in the current directory.     ------- output

The result was as expected. We have already seen in this chapter that when backquotes are placed within double quotes, the shell interprets them and runs the enclosed command; when the same thing is placed within single quotes, the shell takes it literally and prints it as is.

This feature is very useful in shell programming , you will see more of this feature in our upcoming chapters.

Command substitution is enabled when backquotes are used within double quotes; if you use single quotes, it is not.

  • Instead of `command`, you can use the form recommended by POSIX: place your command in parentheses preceded by a $, as in $(command).
  • echo Today's date is $(date)
  • It's just a matter of choice which form you use in your echo statement.
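The $( ) form has one practical advantage worth sketching: it nests cleanly, whereas nested backquotes need awkward escaping. The echo lines below mirror the earlier examples:

```shell
echo "Today's date is $(date)"
echo "There are $(ls | wc -l) files in the current directory."
echo "$(echo outer $(echo inner))"   # $( ) nests without any escaping
```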

Shell Variable -

In our previous discussions we have seen some built-in shell variables like TERM and SHELL. Now let's see some features of variable assignment and its usage in shell programming.

In the shell, variable assignment is of the form variable=value (no spaces on either side of the =), but to evaluate a variable we add a $ prefix to its name:

  i=10      ---------- no $ required for assignment
  echo $i   ---------- but needed for evaluation

We can even assign the value of one variable to another:

  count=$i      ---------- assigning a value to another variable
  echo $count

Some important points about shell variables:

  • When the shell reads the command line, it interprets any word preceded by a $ as a variable and replaces the word with the value of the variable.
  • A variable name begins with a letter and can contain numerals and the underscore (_) as its other characters.
  • In the shell we don't prefix variables with data types like char, string or int, and we don't declare them before using them.
  • All variables are of string type by default.
  • All shell variables are initialized to the null string by default. We can explicitly assign a null value with x="" or x='', or even use the shorthand x= (a null string).

A variable can be removed with unset and protected from reassignment with readonly. Both are shell built-in commands:

  unset x       ---------- x is now undefined
  readonly x    ---------- x can't be reassigned

By convention, variable names used by the UNIX system and software packages are in uppercase. You are advised to use lowercase variable names in your shell scripts, simply to distinguish them from system variables.
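The points above can be sketched in one short session (the names i, count and unset_var are illustrative; unset_var is assumed to have never been assigned):

```shell
i=10                 # no spaces on either side of =
echo $i              # 10
count=$i             # one variable's value assigned to another
echo $count          # 10
echo "[$unset_var]"  # [] - an unassigned variable expands to a null string
unset i              # i is now undefined
readonly count       # any further assignment to count is an error
```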

Use of shell variables -

Setting a pathname - If a pathname is used several times in a script, you should assign it to a variable. We can then use that variable as an argument to commands:

  log="/v/mangesh/programs/scripts"
  cd $log ; pwd

By capturing the pathname in a variable, we can use the variable in our script rather than typing the path everywhere. There is one more advantage: whenever the path or directory changes, we can simply edit the variable definition and everything will work as before.

Using command substitution -

We can also use the feature of command substitution to set a variable:
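The original example here is missing; a typical illustrative usage stores a command's output in a variable (mydir and filecount are made-up names):

```shell
mydir=$(pwd)            # capture the current directory
filecount=$(ls | wc -l) # capture a pipeline's output
echo "$filecount files in $mydir"
```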


Concatenating Variables and Strings -

In our shell scripts we often need to concatenate a variable with another variable or with a string. To concatenate two variables, just place them side by side:

  name=Mangesh ; ext=.txt   ---------- two assignments in one line
  file=$name$ext            ---------- the result is Mangesh.txt
  file=${name}$ext          ---------- you can use curly braces to delimit the name
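The braces matter when a character follows the variable name directly; without them the shell would read the extra character as part of the name. A sketch:

```shell
name=Mangesh ; ext=.txt
file=$name$ext
echo $file          # Mangesh.txt
file=${name}2$ext   # braces mark where the variable name ends
echo $file          # Mangesh2.txt
```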

In this chapter we have covered the most common and most frequently used shell characters. There are many more characters that the shell looks for that have been ignored here. The next chapter examines the process; the shell itself is also a process.
