Wednesday, January 9, 2013

Shell Programming:DECISION-MAKING & LOOP CONSTRUCTS


DECISION-MAKING & LOOP CONSTRUCTS:

* Shell programs can perform conditional tests on their arguments and variables and execute different commands based on the results. For example:

   if [ "$1" = "hyena" ]
   then
     echo "Sorry, hyenas not allowed."
     exit
   elif [ "$1" = "jackal" ]
   then
     echo "Jackals not welcome."
     exit
   else
     echo "Welcome to Bongo Congo."
   fi 
   echo "Do you have anything to declare?"

-- checks the command line to see if the first argument is "hyena" or "jackal" and bails out, using the "exit" command, if they are. Other arguments allow the rest of the file to be executed. Note how "$1" is enclosed in double quotes, so the test will not generate an error message if it yields a null result.
There are a wide variety of such test conditions:

   [ "$shvar" = "fox" ]    String comparison, true if match.
   [ "$shvar" != "fox" ]   String comparison, true if no match.
   [ "$shvar" = "" ]       True if null variable.
   [ "$shvar" != "" ]      True if not null variable.

   [ "$nval" -eq 0 ]       Integer test; true if equal to 0.
   [ "$nval" -ge 0 ]       Integer test; true if greater than or equal to 0.
   [ "$nval" -gt 0 ]       Integer test; true if greater than 0.
   [ "$nval" -le 0 ]       Integer test; true if less than or equal to 0.
   [ "$nval" -lt 0 ]       Integer test; true if less than to 0.
   [ "$nval" -ne 0 ]       Integer test; true if not equal to 0.

   [ -d tmp ]              True if "tmp" is a directory.
   [ -f tmp ]              True if "tmp" is an ordinary file.
   [ -r tmp ]              True if "tmp" can be read.
   [ -s tmp ]              True if "tmp" is nonzero length.
   [ -w tmp ]              True if "tmp" can be written.
   [ -x tmp ]              True if "tmp" is executable.
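
For instance, here is a minimal sketch using one of the file tests above (the directory name is just an example):

   # run one command or another depending on whether the directory exists
   if [ -d /users/group/archives ]
   then
     echo "The archive directory exists."
   else
     echo "No archive directory found."
   fi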

Incidentally, in the example above:

   if [ "$1" = "hyena" ]

-- there is a potential pitfall in that a user might enter, say, "-d" as a command-line parameter, which would cause an error when the program was run. Now there is only so much that can be done to save users from their own clumsiness, and "bullet-proofing" simple example programs tends to make them not so simple any more, but there is a simple, if a bit cluttered, fix for such a potential pitfall. It is left as an exercise for the reader.
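One common version of such a fix, shown here purely as a sketch, is to prefix both sides of the comparison with a harmless character, so that an argument beginning with "-" can't be mistaken for a test operator:

   if [ "x$1" = "xhyena" ]
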
There is also a "case" control construct that checks for equality with a list of items. It can be used with the example at the beginning of this section:

   case "$1" 
   in
     "gorilla")  echo "Sorry, gorillas not allowed."
                 exit;;
     "hyena")    echo "Hyenas not welcome."
                 exit;;
     *)          echo "Welcome to Bongo Congo.";;
   esac

The string ";;" is used to terminate each "case" clause.
* The fundamental loop construct in the shell is based on the "for" command. For example:

   for nvar in 1 2 3 4 5
   do
     echo $nvar
   done

-- echoes the numbers 1 through 5. The names of all the files in the current directory could be displayed with:

   for file in *
   do
     echo $file
   done

One nice little feature of the shell is that if the "in" parameters are not specified for the "for" command, it just cycles through the command-line arguments.
* There is a "break" command to exit a loop if necessary:

   for file
   do
     if [ "$file" = punchout ]
     then 
       break
     else
       echo $file
     fi
   done

There is also a "continue" command that starts the next iteration of the loop immediately. There must be a command in the "then" or "else" clauses, or the result is an error message. If it's not convenient to actually do anything in the "then" clause, a ":" can be used as a "no-op" command:

   then
     :
   else
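
As a minimal sketch of "continue", reusing the "punchout" example above: unlike "break", which stops the loop entirely, "continue" just skips that one name and keeps going through the rest of the arguments.

   for file
   do
     if [ "$file" = punchout ]
     then
       continue
     fi
     echo $file
   done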

* There are two other looping constructs available as well, "while" and "until". For an example of "while":

   n=10
   while [ "$n" -ne 0 ]
   do
     echo $n
     n=`expr $n - 1`
   done

-- counts down from 10 to 1. The "until" loop has similar syntax but tests for a false condition:

   n=10
   until [ "$n" -eq 0 ]
   do
   ...

Shell Programming:COMMAND-LINE ARGUMENTS

COMMAND-LINE ARGUMENTS:

* In general, shell programs operate in a "batch" mode, that is, without interaction from the user, and so most of their parameters are obtained on the command line. Each argument on the command line can be seen inside the shell program as a shell variable of the form "$1", "$2", "$3", and so on, with "$1" corresponding to the first argument, "$2" the second, "$3" the third, and so on.
There is also a "special" argument variable, "$0", that gives the name of the shell program itself. Other special variables include "$#", which gives the number of arguments supplied, and "$*", which gives a string with all the arguments supplied.
Since the argument variables only run from "$1" to "$9", what happens when there are more than 9 arguments? No problem: the "shift" command can be used to move the arguments down through the argument list. That is, when "shift" is executed, the second argument becomes "$1", the third argument becomes "$2", and so on; if "shift" is performed again, the third argument becomes "$1"; and so on. A count can be specified to cause a multiple shift:

   shift 3

-- shifts the arguments three times, so that the fourth argument ends up in "$1".
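A minimal sketch combining "shift" and "$#" with the "while" loop from the previous section: walk through all the arguments, however many there are.

   while [ "$#" -gt 0 ]
   do
     echo "$1"
     shift
   done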

Shell Programming:COMMAND SUBSTITUTION


COMMAND SUBSTITUTION:

* The next step is to consider shell command substitution. Like any programming language, the shell does exactly what it is told to do, and so it is important to be very specific when telling it to do something. As an example, consider the "fgrep" command, which searches a file for a string. For example, to search a file named "source.txt" for the string "Coyote", enter:
   fgrep Coyote source.txt
-- and it would print out the matching lines. However, suppose we wanted to search for "Wile E. Coyote". If we did this as:
   fgrep Wile E. Coyote source.txt
-- we'd get an error message that "fgrep" couldn't open "E.". The string has to be enclosed in double-quotes (""):
   fgrep "Wile E. Coyote" source.txt
If a string has a special character in it, such as "*" or "?", that must be interpreted as a "literal" and not a wildcard, the shell can get a little confused. To ensure that the wildcards are not interpreted, the wildcard can either be "escaped" with a backslash ("\*" or "\?") or the string can be enclosed in single quotes, which prevents the shell from interpreting any of the characters within the string. For example, if:
   echo "$shvar"
-- is executed from a shell program, it would output the value of the shell variable "$shvar". In contrast, executing:
   echo '$shvar'
-- the output is the string "$shvar".
* Having considered "double-quoting" and "single-quoting", let's now consider "back-quoting". This is a little tricky to explain. As a useful tool, consider the "expr" command, which can be used to perform simple math from the command line:
   expr 2 + 4
This displays the value "6". There must be spaces between the parameters; in addition, to perform a multiplication the "*" has to be "escaped" so the shell doesn't interpret it:
   expr 3 \* 7 
Now suppose the string "expr 12 / 3" has been stored in a shell variable named "shcmd"; then executing:
   echo $shcmd
-- or:
   echo "$shcmd"
-- would simply produce the text "expr 12 / 3". If single-quotes were used:
   echo '$shcmd'
-- the result would be the string "$shcmd". However, if back-quotes, the reverse form of a single quote, were used:
   echo `$shcmd`
-- the result would be the value "4", since the string inside "shcmd" is executed. This is an extremely powerful technique that can be very confusing to use in practice.
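For a minimal sketch of how back-quotes get used in practice, the output of "expr" can be captured in a shell variable, which is the same trick used in the countdown loop earlier:
   n=`expr 2 + 4`
   echo "Two plus four gives $n."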

Shell Programming:SHELL VARIABLES

SHELL VARIABLES:

* The first useful command to know about in building shell programs is "echo", which can be used to produce output from a shell program:

   echo "This is a test!"

This sends the string "This is a test!" to standard output. It is recommended to write shell programs that generate some output to inform the user of what they are doing.
The shell allows variables to be defined to store values. It's simple: to declare a variable, just assign a value to it:

   shvar="This is a test!"

The string is enclosed in double-quotes to ensure that the variable swallows the entire string (more on this later), and there are no spaces around the "=". The value of the shell variable can be obtained by preceding it with a "$":

   echo $shvar

This displays "This is a test!". If no value had been stored in that shell variable, the result would have simply been a blank line. Values stored in shell variables can be used as parameters to other programs as well:

   ls $lastdir

The value stored in a shell variable can be erased by assigning the "null string" to the variable:

   shvar=""

There are some subtleties in using shell variables. For example, suppose a shell program performed the assignment:

   allfiles=*

-- and then performed:

   echo $allfiles

This would echo a list of all the files in the directory. However, only the string "*" would be stored in "allfiles". The expansion of "*" only occurs when the "echo" command is executed.
Another subtlety is in modifying the values of shell variables. Suppose we have a file name in a shell variable named "myfile" and want to copy that file to another with the same name, but with "2" tacked on to the end. We might think to try:

   cp $myfile $myfile2

-- but the problem is that the shell will think that "myfile2" is a different shell variable, and this won't work. Fortunately, there is a way around this; the change can be made as follows:

   cp $myfile ${myfile}2

A UNIX installation will have some variables installed by default, most importantly $HOME, which gives the location of a particular user's home directory.
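For example (a trivial sketch):

   echo "Home directory: $HOME"
   ls $HOME
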
As a final comment on shell variables, if one shell program calls another and the two shell programs have the same variable names, the two sets of variables will be treated as entirely different variables. To call other shell programs from a shell program and have them use the same shell variables as the calling program requires use of the "export" command:

   shvar="This is a test!"
   export shvar
   echo "Calling program two."
   shpgm2
   echo "Done!"

If "shpgm2" simply contains:

   echo $shvar

-- then it will echo "This is a test!".

Shell Programming:GETTING STARTED

GETTING STARTED:

* The first thing to do in understanding shell programs is to understand the elementary system commands that can be used in them. A list of fundamental UNIX system commands follows:
  ls         # Give a simple listing of files.
  cp         # Copy files.
  mv         # Move or rename files.
  rm         # Remove files.  
  rm -r      # Remove entire directory subtree.
  cd         # Change directories.
  pwd        # Print working directory.
  cat        # List a file or files sequentially.
  more       # Display a file a screenful at a time.
  pg         # Variant on "more".
  mkdir      # Make a directory.
  rmdir      # Remove a directory.
The shell executes such commands when they are typed in from the command prompt with their appropriate parameters, which are normally options and file names.
* The shell also allows files to be defined in terms of "wildcard characters" that define a range of files. The "*" wildcard character substitutes for any string of characters, so:
   rm *.txt
-- deletes all files that end with ".txt". The "?" wildcard character substitutes for any single character, so:
   rm book?.txt
-- deletes "book1.txt", "book2.txt", and so on. More than one wildcard character can be used at a time, for example:
   rm *book?.txt
-- deletes "book1.txt", "mybook1.txt", "bigbook2.txt", and so on.
* Another shell capability is "input and output redirection". The shell, like other UNIX utilities, accepts input by default from what is called "standard input", and generates output by default to what is called "standard output". These are normally defined as the keyboard and display, respectively, or what is referred to as the "console" in UNIX terms. However, standard input or output can be "redirected" to a file or another program if needed. Consider the "sort" command. This command sorts a list of words into alphabetic order; typing in:
   sort
   PORKY
   ELMER
   FOGHORN
   DAFFY
   WILE
   BUGS
   <CTL-D>
-- spits back:
   BUGS
   DAFFY
   ELMER
   FOGHORN
   PORKY
   WILE
Note that the CTL-D key input terminates direct keyboard input. It is also possible to store the same words in a file and then "redirect" the contents of that file to standard input with the "<" operator:
   sort < names.txt
This would list the sorted names to the display as before. They can be redirected to a file with the ">" operator:
   sort < names.txt > output.txt
They can also be appended to an existing file using the ">>" operator:
   sort < names.txt >> output.txt
In these cases, there's no visible output, since the command just executes and ends. However, if that's a problem, it can be fixed by connecting the "tee" command to the output through a "pipe", designated by "|". This allows the standard output of one command to be chained into the standard input of another command. In the case of "tee", it accepts text into its standard input and then dumps it both to a file and to standard output:
   sort < names.txt | tee output.txt
So this both displays the names and puts them in the output file. Many commands can be chained together to "filter" information through several processing steps. This ability to combine the effects of commands is one of the beauties of shell programming. By the way, "sort" has some handy additional options:
   sort -u    # Eliminate redundant lines in output.
   sort -r    # Sort in reverse order.
   sort -n    # Sort numbers. 
   sort -k 2  # Skip first field in sorting.
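As a small sketch of such a "filter" chain, using only commands already mentioned ("names.txt" is reused from above; "reversed.txt" is just an example output name), the following sorts the names in reverse order, keeps a copy, and pages through the result:
   sort -r < names.txt | tee reversed.txt | pg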
* If a command generates an error, it is displayed to what is called "standard error", instead of standard output, which defaults to the console. It will not be redirected by ">". However, the operator "2>" can be used to redirect the error message. For example:
   ls xyzzy 2> /dev/null
-- will give an error message if the file "xyzzy" doesn't exist, but the error will be redirected to the file "/dev/null". This is actually a "special file" that exists under UNIX where everything sent to it is simply discarded.
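A minimal sketch combining the two redirections (the output file names are just examples): the normal listing goes to one file, while the complaint about the nonexistent "xyzzy" goes to another:
   ls *.txt xyzzy > listing.txt 2> errors.txt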
* The shell permits the execution of multiple commands sequentially on one line by chaining them with a ";":
   rm *.txt ; ls
A time-consuming program can also be run in a "parallel" fashion by following it with a "&":
   sort < bigfile.txt > output.txt &
* These commands and operations are essential elements for creating shell programs. They can be stored in a file and then executed by the shell. To tell the shell that the file contains commands, just mark it as "executable" with the "chmod" command. Each file under UNIX has a set of "permission" bits, listed by an "ls -l" -- the option providing file details -- as:
   rwxrwxrwx
The "r" gives "read" permission, the "w" gives "write" permission, and the "x" gives "execute" permission. There are three sets of these permission bits, one for the user, one for other members of a local group of users on a system, and one for everyone who can access the system -- remember that UNIX was designed as a multiuser environment.
The "chmod" command can be used to set these permissions, with the permissions specified as an octal code. For example:
   chmod 644 myfile.txt
This sets both read and write permission on the file for the user, but everybody else on the system only gets read permission. The same octal scheme can be used to set execute permission, though it's simpler just to use the chmod "+x" option:
   chmod +x mypgm
This done, if the name "mypgm" is entered at the prompt, the shell reads the commands out of "mypgm" and executes them. The execute permission can be removed with the "-x" option.
For example, suppose we want to be able to inspect the contents of a set of archive files stored in the directory "/users/group/archives". We could create a file named "ckarc" and store the following command string in it:
  ls /users/group/archives | pg
This is a very simple shell program. As noted, the shell has control constructs, supports storage variables, and has several options that can be set to allow much more sophisticated programs. The following sections describe these features in a quick outline fashion.

Shell Programming:Introduction

* The UNIX operating system provides a flexible set of simple tools to perform a wide variety of system-management, text-processing, and general-purpose tasks. These simple tools can be used in very powerful ways by tying them together programmatically, using "shell scripts" or "shell programs".
The UNIX "shell" itself is a user-interface program that accepts commands from the user and executes them. It can also accept the same commands written as a list in a file, along with various other statements that the shell can interpret to provide input, output, decision-making, looping, variable storage, option specification, and so on. This file is a shell program.
Shell programs, like programs in any other language, are useful for some things but not for others. They are excellent for system-management tasks but not for general-purpose programming of any sophistication. Shell programs, though generally simple to write, are also tricky to debug and slow in operation.
There are three major versions of the UNIX shell: the original "Bourne shell (sh)", the "C shell (csh)" with its C-like syntax, and the "Korn shell (ksh)", an extension of the Bourne shell that is in predominant use. The Bourne shell also lives on in popular use as the freeware "Bourne-again shell", AKA "bash".

Visual Basic:Loop

Do...Loop:

Used to execute a block of statements an unspecified number of times.

Do While condition
     statements
Loop

First, the condition is tested; if the condition is True, the statements are executed. When execution reaches the Loop statement, it goes back to the Do and tests the condition again. If the condition is False on the first pass, the statements are never executed.




For...Next:

When the number of iterations of the loop is known, it is better to use the For...Next rather than the Do...Loop.

For counter = start To end
     statements
Next

1) The counter is set to the value of start.
2) Counter is checked to see if it is greater than end; if it is, control passes to the statement after the Next; if not, the statements are executed.
3) At Next, counter is incremented and control goes back to step 2).
