Linux command basics reference guide

Whilst many people who use Linux will use a graphical interface, there are times when it is very useful to use command line tools: for convenience (by automating tedious tasks), to provide additional options, or when managing a computer remotely. These commands are run from the Linux shell.

This reference guide explains some of the basic commands and how they can be combined to work together. Here the words command and program are used interchangeably. In this reference guide I am referring to command line programs which take a given input and return information after the appropriate processing. This is different from full applications, which may be graphical, provide a more fully featured text display, or require user interaction whilst running.

Brief Command Names

UNIX commands are not necessarily the easiest to remember. They are designed to be short in order to reduce the amount of typing:

e.g.
ls - List directory contents
cd - Change Directory
cp - Copy
pg - Show output one Page at a time
more - Similar to pg
less - Like more, but with more features!

This can make the commands a little harder to remember, but does save on typing when entering a large number of commands.

The general UNIX philosophy is for each command to perform only its own specific function, and to do it well, without duplicating the functionality offered by other commands. For example, rather than every program having to implement a sort function, most commands produce unsorted output and, if necessary, that output can be passed to the sort command, whose only task is to sort the data into a specific order.

Another example of this in action is that many commands will output to the screen without worrying about how much data is being given. Used alone, a command that produces a large amount of data will cause some of the information to scroll off the screen. To view the information a page at a time the output is piped through the pg command, which handles the paging. Alternatively the output could be piped through the sort command (to sort it into order), or the head or tail commands (to show the first or last few lines).

The following example shows the ls command which is piped through the more command so that one screen-full is shown at a time.

ls | more

This will be discussed in more detail later.

Format of Commands

Commands designed to run from the command line will normally have some options that can be used to change the way the command works. A typical command may have multiple options to change the way it acts on the data, and may take other arguments such as filenames or text to be processed.

command option(s) argument(s)

An example of this would be the ls command. The ls command will be explained separately; for now it is sufficient to know that it lists the contents of a directory and accepts certain options and arguments (similar to the dir command in DOS/Windows). One option to the ls command is "-l", which provides more details about the files; another is "-a", which shows all files including hidden files. The argument provided is a file or directory name.

ls -l /home/stewart

will show the contents of my home directory. The -l is an option in that it changes the way the program runs, and the argument is /home/stewart, which tells the program which directory to look in. The ls command doesn't require any options or arguments; for example, the following are also perfectly valid commands.

ls /home/stewart         shows a brief listing of the directory
ls -l         shows a full listing of the current directory
ls         shows a brief listing of the current directory

If more than one option is required there are two ways of specifying them. Either individually as separate options, i.e.

ls -l -a /home/stewart

or combined, i.e.

ls -la /home/stewart

Both of the above will give the same output.

Making the most of UNIX commands

Whilst the number of options on each UNIX command may seem overwhelming at first, this is part of what makes UNIX so powerful. Another of the features that makes UNIX so powerful is the ability to combine several commands to make them more useful. This can be achieved either by stringing commands together on the command line or by bundling the commands together into a script file, which can range from something very trivial to a program in its own right.
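
As a very simple illustration of bundling commands into a script (a minimal sketch; the directory used is just an example and would normally be changed to suit), the following lines could be saved into a text file, made executable with chmod +x, and then run like any other command:

#!/bin/bash
# Show a sorted listing of the given directory, one page at a time
ls /home/stewart | sort | more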

Using command switches

The most basic way of extending the functionality of a command is to try some of the switches available.

This uses the ls command again to show the different output.

$ ls
docs         readme.txt   file1.txt    file2.txt

By adding the '-l' option more information is provided.

$ ls -l
total 6177
drwx------   2 stewart  users        512 Sep 16 17:42 docs
-rw-------   1 stewart  users        124 Sep 16 17:26 readme.txt
-rw-------   1 stewart  users    3156558 Sep 16 17:25 file1.txt
-rw-------   1 stewart  users       3505 Sep 16 17:25 file2.txt

Pipelines (|)

As mentioned earlier, we often want to take the output of one command and pass it on to another. The standard way of doing this is using a pipeline, often referred to as a pipe, or piping the output.

The pipe is a vertical bar '|'. On a standard UK keyboard this is normally found at the bottom left of the keyboard and is typed by using shift and the '\' key. On a US keyboard this shares the same '\' key, but is sometimes located above the RETURN key. On other European keyboards this may be on one of the number keys (e.g. Alt-Gr 6).

The first command to run is listed first followed by the pipe symbol and then followed by the second command. Any output from the first command is then used as input to the second command.

For example, to sort a basic directory listing by name, the ls command is piped through the sort command.

ls | sort

The output can be passed through a number of commands by using a pipe through each one. The full command string is referred to as a pipeline.

ls | sort | more
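
The head and tail commands mentioned earlier can be used in a pipeline in the same way. As a small sketch, the following shows only the first five lines of the long directory listing (the number of lines is just an example):

ls -l | head -n 5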

Redirecting stdout, stdin and stderr (> <)

Unless a command is piped into another, the output normally goes to standard output (stdout), which is normally the screen. The input is normally taken from standard input (stdin), which is normally the keyboard. To automate processes it is sometimes necessary to change this so that, for example, the output from a command is sent to a file or to the printer, or the input to a command is taken from a file. This is done by redirecting the stdout and stdin streams.

Redirecting Standard Output (stdout) >

The output from the ls command could be redirected to a file, in this case called dirlist.txt:

ls > dirlist.txt

If the file dirlist.txt already exists its contents will be overwritten. It is also possible to append the output to the end of an existing file by using >> instead of >.

For example

echo "This is the next line of the log" >> log.file

Here any existing content will remain, but the text "This is the next line of the log" will be added to the bottom of the file.

Redirecting Standard Error (stderr) 2>

Whilst you may see all the output from a command on a single screen, this is not all necessarily coming from stdout. There is also another data stream called standard error (stderr) which by default is directed to the same screen as stdout. This data stream is used for error messages. The advantage of having errors as a separate stream is that even if you redirect stdout to a file you will still instantly see any error messages on the screen.

If the command is running automatically without user interaction then there may not be anyone to see messages put on the screen. The standard error data stream can therefore be redirected in a similar way to stdout, by prefixing the redirect with the digit 2. Strictly the stdout data stream should be prefixed with the digit 1, however this is usually dropped to save typing. To redirect any error messages to an error.log file and the normal responses to a log file, the following would be used.

command >log.file 2>error.log

The single greater-than (>) can be replaced by the double greater-than symbol (>>) if you would like the output to be appended to the file rather than overwriting it.
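
As a concrete sketch of this in action (assuming that /nonexistent does not exist on the system), the following lists two paths; the listing of /home/stewart is written to log.file while the error message about /nonexistent is written to error.log:

ls /home/stewart /nonexistent >log.file 2>error.log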

It is also possible to write both stdout and the standard error stream to the same file. This is not, as you might expect, simply a case of using the same file name in the above command. The reason for this is that each redirect opens the file separately, so the two streams would overwrite each other's output rather than being combined. Instead the error data stream is redirected into the stdout data stream using 2>&1, which gives:

command >output.file 2>&1
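
Note that the order of the redirects matters: the 2>&1 must come after the redirect of stdout, because it duplicates wherever stdout is pointing at that moment. A rough comparison (using the same example file name):

command >output.file 2>&1         both streams are written to output.file
command 2>&1 >output.file         error messages still appear on the screen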

Using a Temporary File

A similar issue to the problem of trying to write the same file twice is that a file used as input cannot also be used as the output file of the same command. For example it is not valid to issue the following command

sort file1 >file1         This is not valid

Instead the output would have to be redirected to a temporary file and then renamed to the required name.

sort file1 >/tmp/tmp$$
mv /tmp/tmp$$ file1

The $$ is expanded by the shell to the process ID of the current shell, giving the temporary file a unique name. This is useful for temporary files as it helps prevent you overwriting a temporary file in use by a different process.

Output to a File and the Command Line (tee)

The earlier redirects are normally adequate for most purposes; however, sometimes it is necessary for someone to monitor the output of a command whilst also duplicating it into a file, for logging purposes or for further processing.

The tee command is used within a pipeline; whatever input it receives it both writes to a file and forwards on down the pipeline.

command | tee file1

The line above will take any output from the command and put a copy in file1 as well as sending it to the screen. This could be extended by passing the output on to another command for further processing. The tee command can be used any number of times in a pipeline like this.

command1 | tee file1 | command2

If you want tee to append to a file rather than overwrite it, the -a option is used.
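
For example, the following is the same pipeline as before but appending to file1 rather than replacing it (command1 and command2 again stand for any commands):

command1 | tee -a file1 | command2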

Redirecting Standard In (stdin) <

The same basic redirect can also be used in the reverse direction, so that an interactive program that requires input from a user can be automated. For example, consider an interactive program such as ftp (file transfer protocol). The ftp program allows files to be transferred from one computer to another over a network, but normally needs a user to type in the commands to transfer the files. Instead the commands could be entered into a text file, exactly as they would be entered from the keyboard. The file is then directed into the program in place of stdin.

ftp linux2.penguintutor.com <commands.txt
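
As a rough sketch, the commands.txt file might contain something like the following; the username, password and filename are purely illustrative, and the exact commands needed depend upon the ftp client and the server:

stewart
mypassword
get file1.txt
quit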

(in fact there are alternative ways of doing specific ftp transfers which don't require interactive input, such as using nftp, lftp or in some circumstances sitecopy).

No Clobber

Redirecting to a file can have an unfortunate consequence if the file already exists and should not be replaced. By using the redirects incorrectly it is possible to accidentally overwrite an important file. In the bash/korn shell there is an option that prevents overwriting existing files by mistake. This is the noclobber option and is set by typing

set -o noclobber

If an attempt is now made to overwrite a file the shell will issue an error message and prevent the file being written to. The noclobber option can be turned off using

set +o noclobber

If required, the noclobber option could be put in a user's .profile file to have it set automatically.
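
As a rough demonstration (the exact wording of the error message varies between shells; the message shown here is typical of bash):

$ set -o noclobber
$ ls > dirlist.txt
$ ls > dirlist.txt
bash: dirlist.txt: cannot overwrite existing file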

File Descriptor Table

The use of stdin, stdout and stderr with just the single less-than / greater-than signs is possible because of the way that open files are assigned numbers in a process's file descriptor table.

The file descriptor table is a list of numbers relating to the files a process has open. The first three entries are stdin, stdout and stderr, numbered 0 for stdin, 1 for stdout and 2 for stderr. Therefore stdin and stdout can be referred to by < and > alone (no number needed), whereas stderr requires 2> to show that it is output stream number 2 that is to be redirected.
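
For example, the stdout redirect could have its file descriptor number given explicitly; the following two commands are equivalent (using the same example file names as earlier):

command >log.file 2>error.log
command 1>log.file 2>error.log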

Further Reading

Bash Reference Manual
