15.9. Miscellaneous Commands

Commands that fit in no special category

jot, seq

These utilities emit a sequence of integers, with a user-selectable increment.

The default separator character between each integer is a newline, but this can be changed with the -s option.

 bash$ seq 5
 1
 2
 3
 4
 5
 
 
 
 bash$ seq -s : 5
 1:2:3:4:5
 	      

Both jot and seq come in handy in a for loop.


Example 15-50. Using seq to generate loop arguments

   1 #!/bin/bash
   2 # Using "seq"
   3 
   4 echo
   5 
   6 for a in `seq 80`  # or   for a in $( seq 80 )
   7 # Same as   for a in 1 2 3 4 5 ... 80   (saves much typing!).
   8 # May also use 'jot' (if present on system).
   9 do
  10   echo -n "$a "
  11 done      # 1 2 3 4 5 ... 80
  12 # Example of using the output of a command to generate 
  13 # the [list] in a "for" loop.
  14 
  15 echo; echo
  16 
  17 
  18 COUNT=80  # Yes, 'seq' also accepts a replaceable parameter.
  19 
  20 for a in `seq $COUNT`  # or   for a in $( seq $COUNT )
  21 do
  22   echo -n "$a "
  23 done      # 1 2 3 4 5 ... 80
  24 
  25 echo; echo
  26 
  27 BEGIN=75
  28 END=80
  29 
  30 for a in `seq $BEGIN $END`
  31 #  Giving "seq" two arguments starts the count at the first one,
  32 #+ and continues until it reaches the second.
  33 do
  34   echo -n "$a "
  35 done      # 75 76 77 78 79 80
  36 
  37 echo; echo
  38 
  39 BEGIN=45
  40 INTERVAL=5
  41 END=80
  42 
  43 for a in `seq $BEGIN $INTERVAL $END`
  44 #  Giving "seq" three arguments starts the count at the first one,
  45 #+ uses the second for a step interval,
  46 #+ and continues until it reaches the third.
  47 do
  48   echo -n "$a "
  49 done      # 45 50 55 60 65 70 75 80
  50 
  51 echo; echo
  52 
  53 exit 0

A simpler example:

   1 #  Create a set of 10 files,
   2 #+ named file.1, file.2 . . . file.10.
   3 COUNT=10
   4 PREFIX=file
   5 
   6 for filename in `seq $COUNT`
   7 do
   8   touch $PREFIX.$filename
   9   #  Or, can do other operations,
  10   #+ such as rm, grep, etc.
  11 done


Example 15-51. Letter Count

   1 #!/bin/bash
   2 # letter-count.sh: Counting letter occurrences in a text file.
   3 # Written by Stefano Palmeri.
   4 # Used in ABS Guide with permission.
   5 # Slightly modified by document author.
   6 
   7 MINARGS=2          # Script requires at least two arguments.
   8 E_BADARGS=65
   9 FILE=$1
  10 
  11 let LETTERS=$#-1   # How many letters specified (as command-line args).
  12                    # (Subtract 1 from number of command line args.)
  13 
  14 
  15 show_help(){
  16 	   echo
  17            echo Usage: `basename $0` file letters  
  18            echo Note: `basename $0` arguments are case sensitive.
  19            echo Example: `basename $0` foobar.txt G n U L i N U x.
  20 	   echo
  21 }
  22 
  23 # Checks number of arguments.
  24 if [ $# -lt $MINARGS ]; then
  25    echo
  26    echo "Not enough arguments."
  27    echo
  28    show_help
  29    exit $E_BADARGS
  30 fi  
  31 
  32 
  33 # Checks if file exists.
  34 if [ ! -f $FILE ]; then
  35     echo "File \"$FILE\" does not exist."
  36     exit $E_BADARGS
  37 fi
  38 
  39 
  40 
  41 # Counts letter occurrences.
  42 for n in `seq $LETTERS`; do
  43       shift
  44       if [[ `echo -n "$1" | wc -c` -eq 1 ]]; then             #  Checks arg.
  45              echo "$1" -\> `cat $FILE | tr -cd  "$1" | wc -c` #  Counting.
  46       else
  47              echo "$1 is not a  single char."
  48       fi  
  49 done
  50 
  51 exit $?
  52 
  53 #  This script has exactly the same functionality as letter-count2.sh,
  54 #+ but executes faster.
  55 #  Why?

getopt

The getopt command parses command-line options preceded by a dash. This external command corresponds to the getopts Bash builtin. Unlike getopts, however, getopt permits handling long options by means of the -l flag, and it also allows parameter reshuffling.
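
As a minimal sketch of that -l (long option) usage -- assuming the "enhanced" getopt from the util-linux package, which quotes its output so it can safely be re-split with eval set -- long options might be handled like this (the option names -h/--help and -f/--file are arbitrary illustrations):

   1 #  Hedged sketch only: assumes the "enhanced" getopt (util-linux).
   2 args=`getopt -o hf: -l help,file: -n "$0" -- "$@"` || exit 65
   3 eval set -- "$args"        # Reshuffled parameters: options first, then "--".
   4 
   5 while true
   6 do
   7   case "$1" in
   8     -h|--help) echo "Help requested"; shift;;
   9     -f|--file) echo "File = $2"; shift 2;;
  10     --)        shift; break;;             # End of options.
  11   esac
  12 done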


Example 15-52. Using getopt to parse command-line options

   1 #!/bin/bash
   2 # Using getopt
   3 
   4 # Try the following when invoking this script:
   5 #   sh ex33a.sh -a
   6 #   sh ex33a.sh -abc
   7 #   sh ex33a.sh -a -b -c
   8 #   sh ex33a.sh -d
   9 #   sh ex33a.sh -dXYZ
  10 #   sh ex33a.sh -d XYZ
  11 #   sh ex33a.sh -abcd
  12 #   sh ex33a.sh -abcdZ
  13 #   sh ex33a.sh -z
  14 #   sh ex33a.sh a
  15 # Explain the results of each of the above.
  16 
  17 E_OPTERR=65
  18 
  19 if [ "$#" -eq 0 ]
  20 then   # Script needs at least one command-line argument.
  21   echo "Usage $0 -[options a,b,c]"
  22   exit $E_OPTERR
  23 fi  
  24 
  25 set -- `getopt "abcd:" "$@"`
  26 # Sets positional parameters to command-line arguments.
  27 # What happens if you use "$*" instead of "$@"?
  28 
  29 while [ ! -z "$1" ]
  30 do
  31   case "$1" in
  32     -a) echo "Option \"a\"";;
  33     -b) echo "Option \"b\"";;
  34     -c) echo "Option \"c\"";;
  35     -d) echo "Option \"d\" $2";;
  36      *) break;;
  37   esac
  38 
  39   shift
  40 done
  41 
  42 #  It is usually better to use the 'getopts' builtin in a script.
  43 #  See "ex33.sh."
  44 
  45 exit 0

See Example 9-14 for a simplified emulation of getopt.

run-parts

The run-parts command [1] executes all the scripts in a target directory, sequentially in ASCII-sorted filename order. Of course, the scripts need to have execute permission.

The cron daemon invokes run-parts to run the scripts in the /etc/cron.* directories.
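
For example, the Debian version of run-parts supports a --test option that merely lists the scripts that would be executed -- a handy sanity check before letting cron do it for real:

   1 run-parts --test /etc/cron.daily       # List what would run, without running it.
   2 run-parts --verbose /etc/cron.daily     # Actually run the scripts, naming each one.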

yes

In its default behavior the yes command feeds a continuous string of the character y followed by a line feed to stdout. A control-c terminates the run. A different output string may be specified, as in yes different string, which would continually output different string to stdout.

One might well ask the purpose of this. From the command line or in a script, the output of yes can be redirected or piped into a program expecting user input. In effect, this becomes a sort of poor man's version of expect.

yes | fsck /dev/hda1 runs fsck non-interactively (careful!).

yes | rm -r dirname has same effect as rm -rf dirname (careful!).

Warning

Caution is advised when piping yes to a potentially dangerous system command, such as fsck or fdisk. Doing so might have unintended consequences.

Note

The yes command will echo an expanded variable passed to it as an argument. For example:

 bash$ yes $BASH_VERSION
 3.1.17(1)-release
 3.1.17(1)-release
 3.1.17(1)-release
 3.1.17(1)-release
 3.1.17(1)-release
 . . .
 	      

This "feature" may not be particularly useful.

banner

Prints arguments as a large vertical banner to stdout, using an ASCII character (default '#'). This may be redirected to a printer for hardcopy.
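
A quick illustration (assuming banner is installed; it is often absent from modern distributions):

   1 banner "ABS"          # Oversized "ABS" on stdout.
   2 banner "ABS" | lp     # Or straight to the print queue, as mentioned above.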

printenv

Shows all the environmental variables set for a particular user.

 bash$ printenv | grep HOME
 HOME=/home/bozo
 	      

lp

The lp and lpr commands send file(s) to the print queue, to be printed as hard copy. [2] These commands trace the origin of their names to the line printers of another era.

bash$ lp file1.txt or bash$ lp <file1.txt

It is often useful to pipe the formatted output from pr to lp.

bash$ pr -options file1.txt | lp

Formatting packages, such as groff and Ghostscript, may send their output directly to lp.

bash$ groff -Tascii file.tr | lp

bash$ gs -options | lp file.ps

Related commands are lpq, for viewing the print queue, and lprm, for removing jobs from the print queue.

tee

[UNIX borrows an idea from the plumbing trade.]

This is a redirection operator, but with a difference. Like the plumber's tee, it permits "siphoning off" to a file the output of a command or commands within a pipe, but without affecting the result. This is useful for saving the output of an ongoing process to a file or to a printer, perhaps to keep track of it for debugging purposes.

                              (redirection)
                             |----> to file
                             |
   ==========================|====================
   command ---> command ---> |tee ---> command ---> ---> output of pipe
   ===============================================
 	      

   1 cat listfile* | sort | tee check.file | uniq > result.file

(The file check.file contains the concatenated sorted "listfiles," before the duplicate lines are removed by uniq.)
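
The -a option makes tee append to, rather than overwrite, its target file, which suits the debugging scenario mentioned above (a brief sketch; the logfile name is arbitrary):

   1 make 2>&1 | tee -a build.log
   2 #  Compile messages appear on screen as usual,
   3 #+ while a copy accumulates in "build.log".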

mkfifo

This obscure command creates a named pipe, a temporary first-in-first-out buffer for transferring data between processes. [3] Typically, one process writes to the FIFO, and the other reads from it. See Example A-15.
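
A bare-bones sketch of the idea, with one background process reading from the FIFO while another writes into it (the pipe name is arbitrary):

   1 mkfifo myfifo                 # Create the named pipe.
   2 
   3 tr a-z A-Z < myfifo &         # Reader (background): uppercases whatever arrives.
   4 echo "hello, world" > myfifo  # Writer: the reader prints HELLO, WORLD.
   5 
   6 wait                          # Let the background reader finish . . .
   7 rm -f myfifo                  #+ then remove the FIFO.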

   1 #!/bin/bash
   2 # This short script by Omair Eshkenazi.
   3 # Used in ABS Guide with permission (thanks!).
   4 
   5 mkfifo pipe1
   6 mkfifo pipe2
   7 
   8 (cut -d' ' -f1 | tr "a-z" "A-Z") >pipe2 <pipe1 &
   9 ls -l | tr -s ' ' | cut -d' ' -f3,9- | tee pipe1 |
  10 cut -d' ' -f2 | paste - pipe2
  11 
  12 rm -f pipe1
  13 rm -f pipe2
  14 
  15 # No need to kill background processes when script terminates (why not?).
  16 
  17 exit $?
  18 
  19 # Now, invoke the script and explain the output:
  20 #    sh mkfifo-example.sh
  21 
  22 # 4830.tar.gz          BOZO
  23 # pipe1   BOZO
  24 # pipe2   BOZO
  25 # mkfifo-example.sh    BOZO
  26 # Mixed.msg BOZO

pathchk

This command checks the validity of a filename. If the filename exceeds the maximum allowable length (255 characters) or one or more of the directories in its path is not searchable, then an error message results.

Unfortunately, pathchk does not return a recognizable error code, and it is therefore pretty much useless in a script. Consider instead the file test operators.
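
A rough, illustrative substitute using the file test operators and ${#string} (the variable pathname is assumed to hold the name being checked):

   1 if [ ${#pathname} -gt 255 ]           # Longer than the limit cited above?
   2 then
   3   echo "\"$pathname\" is too long."
   4 fi
   5 
   6 dir=$(dirname "$pathname")            # Directory portion of the name.
   7 if [ ! -d "$dir" -o ! -x "$dir" ]     # Missing, or not searchable?
   8 then
   9   echo "Directory \"$dir\" does not exist or is not searchable."
  10 fi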

dd

This is the somewhat obscure and much feared data duplicator command. Originally a utility for exchanging data on magnetic tapes between UNIX minicomputers and IBM mainframes, this command still has its uses. The dd command simply copies a file (or stdin/stdout), but with conversions. Possible conversions are ASCII/EBCDIC, [4] upper/lower case, swapping of byte pairs between input and output, and skipping and/or truncating the head or tail of the input file.

   1 # Converting a file to all uppercase:
   2 
   3 dd if=$filename conv=ucase > $filename.uppercase
   4 #                    lcase   # For lower case conversion

Some basic options to dd are:

  • if=INFILE

    INFILE is the source file.

  • of=OUTFILE

    OUTFILE is the target file, the file that will have the data written to it.

  • bs=BLOCKSIZE

    This is the size of each block of data being read and written, usually a power of 2.

  • skip=BLOCKS

    How many blocks of data to skip in INFILE before starting to copy. This is useful when the INFILE has "garbage" or garbled data in its header or when it is desirable to copy only a portion of the INFILE.

  • seek=BLOCKS

    How many blocks of data to skip in OUTFILE before starting to copy, leaving blank data at beginning of OUTFILE.

  • count=BLOCKS

    Copy only this many blocks of data, rather than the entire INFILE.

  • conv=CONVERSION

    Type of conversion to be applied to INFILE data before copying operation.

A dd --help lists all the options this powerful utility takes.


Example 15-53. A script that copies itself

   1 #!/bin/bash
   2 # self-copy.sh
   3 
   4 # This script copies itself.
   5 
   6 file_subscript=copy
   7 
   8 dd if=$0 of=$0.$file_subscript 2>/dev/null
   9 # Suppress messages from dd:   ^^^^^^^^^^^
  10 
  11 exit $?


Example 15-54. Exercising dd

   1 #!/bin/bash
   2 # exercising-dd.sh
   3 
   4 # Script by Stephane Chazelas.
   5 # Somewhat modified by ABS Guide author.
   6 
   7 infile=$0       # This script.
   8 outfile=log.txt # This output file left behind.
   9 n=3
  10 p=5
  11 
  12 dd if=$infile of=$outfile bs=1 skip=$((n-1)) count=$((p-n+1)) 2> /dev/null
  13 # Extracts characters n to p (3 to 5) from this script.
  14 
  15 # --------------------------------------------------------
  16 
  17 echo -n "hello world" | dd cbs=1 conv=unblock 2> /dev/null
  18 # Echoes "hello world" vertically.
  19 # Why? Newline after each character dd emits.
  20 
  21 exit 0

To demonstrate just how versatile dd is, let's use it to capture keystrokes.


Example 15-55. Capturing Keystrokes

   1 #!/bin/bash
   2 # dd-keypress.sh: Capture keystrokes without needing to press ENTER.
   3 
   4 
   5 keypresses=4                      # Number of keypresses to capture.
   6 
   7 
   8 old_tty_setting=$(stty -g)        # Save old terminal settings.
   9 
  10 echo "Press $keypresses keys."
  11 stty -icanon -echo                # Disable canonical mode.
  12                                   # Disable local echo.
  13 keys=$(dd bs=1 count=$keypresses 2> /dev/null)
  14 # 'dd' uses stdin, if "if" (input file) not specified.
  15 
  16 stty "$old_tty_setting"           # Restore old terminal settings.
  17 
  18 echo "You pressed the \"$keys\" keys."
  19 
  20 # Thanks, Stephane Chazelas, for showing the way.
  21 exit 0

The dd command can do random access on a data stream.
   1 echo -n . | dd bs=1 seek=4 of=file conv=notrunc
   2 #  The "conv=notrunc" option means that the output file
   3 #+ will not be truncated.
   4 
   5 # Thanks, S.C.

The dd command can copy raw data and disk images to and from devices, such as floppies and tape drives (Example A-5). A common use is creating boot floppies.

dd if=kernel-image of=/dev/fd0H1440

Similarly, dd can copy the entire contents of a floppy, even one formatted with a "foreign" OS, to the hard drive as an image file.

dd if=/dev/fd0 of=/home/bozo/projects/floppy.img

Other applications of dd include initializing temporary swap files (Example 28-2) and ramdisks (Example 28-3). It can even do a low-level copy of an entire hard drive partition, although this is not necessarily recommended.
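
The initialization step in those examples boils down to carving out a zero-filled file of the desired size, along these lines (the size and filename here are arbitrary):

   1 dd if=/dev/zero of=/tmp/zerofile bs=1024 count=65536   # 64 MB of zeroed blocks.
   2 #  mkswap / swapon (or mke2fs, for a ramdisk) would then follow;
   3 #+ see the referenced examples for the complete procedures.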

People (with presumably nothing better to do with their time) are constantly thinking of interesting applications of dd.


Example 15-56. Securely deleting a file

   1 #!/bin/bash
   2 # blot-out.sh: Erase "all" traces of a file.
   3 
   4 #  This script overwrites a target file alternately
   5 #+ with random bytes, then zeros before finally deleting it.
   6 #  After that, even examining the raw disk sectors by conventional methods
   7 #+ will not reveal the original file data.
   8 
   9 PASSES=7         #  Number of file-shredding passes.
  10                  #  Increasing this slows script execution,
  11                  #+ especially on large target files.
  12 BLOCKSIZE=1      #  I/O with /dev/urandom requires unit block size,
  13                  #+ otherwise you get weird results.
  14 E_BADARGS=70     #  Various error exit codes.
  15 E_NOT_FOUND=71
  16 E_CHANGED_MIND=72
  17 
  18 if [ -z "$1" ]   # No filename specified.
  19 then
  20   echo "Usage: `basename $0` filename"
  21   exit $E_BADARGS
  22 fi
  23 
  24 file=$1
  25 
  26 if [ ! -e "$file" ]
  27 then
  28   echo "File \"$file\" not found."
  29   exit $E_NOT_FOUND
  30 fi  
  31 
  32 echo; echo -n "Are you absolutely sure you want to blot out \"$file\" (y/n)? "
  33 read answer
  34 case "$answer" in
  35 [nN]) echo "Changed your mind, huh?"
  36       exit $E_CHANGED_MIND
  37       ;;
  38 *)    echo "Blotting out file \"$file\".";;
  39 esac
  40 
  41 
  42 flength=$(ls -l "$file" | awk '{print $5}')  # Field 5 is file length.
  43 pass_count=1
  44 
  45 chmod u+w "$file"   # Allow overwriting/deleting the file.
  46 
  47 echo
  48 
  49 while [ "$pass_count" -le "$PASSES" ]
  50 do
  51   echo "Pass #$pass_count"
  52   sync         # Flush buffers.
  53   dd if=/dev/urandom of=$file bs=$BLOCKSIZE count=$flength
  54                # Fill with random bytes.
  55   sync         # Flush buffers again.
  56   dd if=/dev/zero of=$file bs=$BLOCKSIZE count=$flength
  57                # Fill with zeros.
  58   sync         # Flush buffers yet again.
  59   let "pass_count += 1"
  60   echo
  61 done  
  62 
  63 
  64 rm -f $file    # Finally, delete scrambled and shredded file.
  65 sync           # Flush buffers a final time.
  66 
  67 echo "File \"$file\" blotted out and deleted."; echo
  68 
  69 
  70 exit 0
  71 
  72 #  This is a fairly secure, if inefficient and slow method
  73 #+ of thoroughly "shredding" a file.
  74 #  The "shred" command, part of the GNU "fileutils" package,
  75 #+ does the same thing, although more efficiently.
  76 
  77 #  The file cannot be "undeleted" or retrieved by normal methods.
  78 #  However . . .
  79 #+ this simple method would *not* likely withstand
  80 #+ sophisticated forensic analysis.
  81 
  82 #  This script may not play well with a journaled file system.
  83 #  Exercise (difficult): Fix it so it does.
  84 
  85 
  86 
  87 #  Tom Vier's "wipe" file-deletion package does a much more thorough job
  88 #+ of file shredding than this simple script.
  89 #     http://www.ibiblio.org/pub/Linux/utils/file/wipe-2.0.0.tar.bz2
  90 
  91 #  For an in-depth analysis on the topic of file deletion and security,
  92 #+ see Peter Gutmann's paper,
  93 #+     "Secure Deletion of Data From Magnetic and Solid-State Memory".
  94 #       http://www.cs.auckland.ac.nz/~pgut001/pubs/secure_del.html

See also the dd thread entry in the bibliography.

od

The od, or octal dump, filter converts input (or files) to octal (base-8) or other bases. This is useful for viewing or processing binary data files or otherwise unreadable system device files, such as /dev/urandom, and as a filter for binary data.

   1 head -c4 /dev/urandom | od -N4 -tu4 | sed -ne '1s/.* //p'
   2 # Sample output: 1324725719, 3918166450, 2989231420, etc.
   3 
   4 # From rnd.sh example script, by Stéphane Chazelas

See also Example 9-30 and Example A-37.

hexdump

Performs a hexadecimal, octal, decimal, or ASCII dump of a binary file. This command is the rough equivalent of od, above, but not nearly as useful. May be used to view the contents of a binary file, in combination with dd and less.

   1 dd if=/bin/ls | hexdump -C | less
   2 # The -C option nicely formats the output in tabular form.

objdump

Displays information about an object file or binary executable in either hexadecimal form or as a disassembled listing (with the -d option).

 bash$ objdump -d /bin/ls
 /bin/ls:     file format elf32-i386

 Disassembly of section .init:

 080490bc <.init>:
  80490bc:       55                      push   %ebp
  80490bd:       89 e5                   mov    %esp,%ebp
  . . .
 	      

mcookie

This command generates a "magic cookie," a 128-bit (32-character) pseudorandom hexadecimal number, normally used as an authorization "signature" by the X server. It is also available for use in a script as a "quick 'n dirty" random number.

   1 random000=$(mcookie)

Of course, a script could use md5sum for the same purpose.

   1 # Generate md5 checksum on the script itself.
   2 random001=`md5sum $0 | awk '{print $1}'`
   3 # Uses 'awk' to strip off the filename.

The mcookie command gives yet another way to generate a "unique" filename.


Example 15-57. Filename generator

   1 #!/bin/bash
   2 # tempfile-name.sh:  temp filename generator
   3 
   4 BASE_STR=`mcookie`   # 32-character magic cookie.
   5 POS=11               # Arbitrary position in magic cookie string.
   6 LEN=5                # Get $LEN consecutive characters.
   7 
   8 prefix=temp          #  This is, after all, a "temp" file.
   9                      #  For more "uniqueness," generate the
  10                      #+ filename prefix using the same method
  11                      #+ as the suffix, below.
  12 
  13 suffix=${BASE_STR:POS:LEN}
  14                      #  Extract a 5-character string,
  15                      #+ starting at position 11.
  16 
  17 temp_filename=$prefix.$suffix
  18                      # Construct the filename.
  19 
  20 echo "Temp filename = "$temp_filename""
  21 
  22 # sh tempfile-name.sh
  23 # Temp filename = temp.e19ea
  24 
  25 #  Compare this method of generating "unique" filenames
  26 #+ with the 'date' method in ex51.sh.
  27 
  28 exit 0

units

This utility converts between different units of measure. While normally invoked in interactive mode, units may find use in a script.


Example 15-58. Converting meters to miles

   1 #!/bin/bash
   2 # unit-conversion.sh
   3 
   4 
   5 convert_units ()  # Takes as arguments the units to convert.
   6 {
   7   cf=$(units "$1" "$2" | sed --silent -e '1p' | awk '{print $2}')
   8   # Strip off everything except the actual conversion factor.
   9   echo "$cf"
  10 }  
  11 
  12 Unit1=miles
  13 Unit2=meters
  14 cfactor=`convert_units $Unit1 $Unit2`
  15 quantity=3.73
  16 
  17 result=$(echo $quantity*$cfactor | bc)
  18 
  19 echo "There are $result $Unit2 in $quantity $Unit1."
  20 
  21 #  What happens if you pass incompatible units,
  22 #+ such as "acres" and "miles" to the function?
  23 
  24 exit 0

m4

A hidden treasure, m4 is a powerful macro processing filter, [5] virtually a complete language. Although originally written as a pre-processor for RatFor, m4 turned out to be useful as a stand-alone utility. In fact, m4 combines some of the functionality of eval, tr, and awk, in addition to its extensive macro expansion facilities.

The April, 2002 issue of Linux Journal has a very nice article on m4 and its uses.


Example 15-59. Using m4

   1 #!/bin/bash
   2 # m4.sh: Using the m4 macro processor
   3 
   4 # Strings
   5 string=abcdA01
   6 echo "len($string)" | m4                            #   7
   7 echo "substr($string,4)" | m4                       # A01
   8 echo "regexp($string,[0-1][0-1],\&Z)" | m4          # 01Z
   9 
  10 # Arithmetic
  11 echo "incr(22)" | m4                                #  23
  12 echo "eval(99 / 3)" | m4                            #  33
  13 
  14 exit 0

doexec

The doexec command enables passing an arbitrary list of arguments to a binary executable. In particular, passing argv[0] (which corresponds to $0 in a script) lets the executable be invoked by various names, and it can then carry out different sets of actions, according to the name by which it was called. What this amounts to is a roundabout way of passing options to an executable.

For example, the /usr/local/bin directory might contain a binary called "aaa". Invoking doexec /usr/local/bin/aaa list would list all those files in the current working directory beginning with an "a", while invoking (the same executable with) doexec /usr/local/bin/aaa delete would delete those files.

Note

The various behaviors of the executable must be defined within the code of the executable itself, analogous to something like the following in a shell script:
   1 case `basename $0` in
   2 "name1" ) do_something;;
   3 "name2" ) do_something_else;;
   4 "name3" ) do_yet_another_thing;;
   5 *       ) bail_out;;
   6 esac

dialog

The dialog family of tools provides a method of calling interactive "dialog" boxes from a script. The more elaborate variations of dialog -- gdialog, Xdialog, and kdialog -- actually invoke X-Windows widgets. See Example 33-19.
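
As a minimal sketch, a yes/no box can be driven from a script by checking the exit status that dialog returns (0 = Yes, 1 = No, 255 = ESC):

   1 dialog --title "Confirmation" --yesno "Proceed?" 7 40
   2 answer=$?                  # Capture the button choice.
   3 clear                      # Tidy up the screen after the widget exits.
   4 
   5 if [ "$answer" -eq 0 ]
   6 then
   7   echo "User chose Yes."
   8 else
   9   echo "User chose No."
  10 fi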

sox

The sox, or "sound exchange," command plays and performs transformations on sound files. In fact, the /usr/bin/play executable (now deprecated) is nothing but a shell wrapper for sox.

For example, sox soundfile.wav soundfile.au changes a WAV sound file into a (Sun audio format) AU sound file.

Shell scripts are ideally suited for batch-processing sox operations on sound files. For examples, see the Linux Radio Timeshift HOWTO and the MP3do Project.
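
For instance, a short loop suffices to apply the WAV-to-AU conversion shown above to every file in a directory (sketch only; filenames assumed):

   1 for wavfile in *.wav
   2 do
   3   sox "$wavfile" "${wavfile%.wav}.au"   # Same basename, .au extension.
   4 done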

Notes

[1]

This is actually a script adapted from the Debian Linux distribution.

[2]

The print queue is the group of jobs "waiting in line" to be printed.

[3]

For an excellent overview of this topic, see Andy Vaught's article, Introduction to Named Pipes, in the September, 1997 issue of Linux Journal.

[4]

EBCDIC (pronounced "ebb-sid-ick") is an acronym for Extended Binary Coded Decimal Interchange Code. This is an IBM data format no longer in much use. A bizarre application of the conv=ebcdic option of dd is as a quick 'n easy, but not very secure text file encoder.
   1 cat $file | dd conv=swab,ebcdic > $file_encrypted
   2 # Encode (looks like gibberish).		    
   3 # Might as well switch bytes (swab), too, for a little extra obscurity.
   4 
   5 cat $file_encrypted | dd conv=swab,ascii > $file_plaintext
   6 # Decode.

[5]

A macro is a symbolic constant that expands into a command string or a set of operations on parameters.