
2. Introduction

Before we get into "cooking" and the recipes proper, this first part of the book deals with preliminaries, explaining the general techniques and methods for working with Linux--including how to get the system ready for use, and how to run commands on the system.

The rest of the book is all recipes, which are sorted in sections by the tasks they perform or the objects they work on--such as text, files, images, and so forth.

2.1 Background and History
2.2 What to Try First
2.3 If You Need More Help



2.1 Background and History

In order to understand what Linux is all about, it helps to know a bit about how it all began. So the following is a historical overview, giving a concise background of the software that is the subject of this book.

2.1.1 What's Unix?
2.1.2 What's Free Software?
2.1.3 What's Open Source?
2.1.4 What's Linux?
2.1.5 What's Debian?
2.1.6 Unix and the Tools Philosophy



2.1.1 What's Unix?

WWW: http://www.bell-labs.com/history/unix/
WWW: http://internet-history.org/archives/early.history.of.unix.html

Unix, the original ancestor of Linux, is an operating system--or at least it was one. The original system known as Unix proper is not the "Unix" we know and use today; there are now many "flavors" of Unix, of which Linux has become the most popular.

A product of the 1960s, Unix and its related software were invented in 1969 by Dennis Ritchie, Ken Thompson, Brian Kernighan, and other hackers at Bell Labs; its name was a play on "Multics," another operating system of the time.(3)

In the early days of Unix, any interested party who had the hardware to run it on could get a tape of the software from Bell Labs, with printed manuals, for a very nominal charge. (This was before the era of personal computing; in practice, mostly only universities and research laboratories did so.) Local sites played with the software's source code, extending and customizing the system to their needs and liking.

Beginning in the late 1970s, computer scientists at the University of California, Berkeley--a licensee of the Unix source code--made their own improvements and enhancements to the Unix source during the course of their research, which included the development of TCP/IP networking. Their work became known as the BSD ("Berkeley Software Distribution") flavor of Unix.

The source code of their work was made publicly available under licensing that permitted redistribution, with source or without, provided that Berkeley was credited for their portions of the code. There are many modern variants of the original BSD still actively developed today, and some of them--such as NetBSD and OpenBSD--can run on personal computers.

NOTE: The uppercase word `UNIX' became a trademark of AT&T (since transferred to other organizations), to mean their particular operating system. But today, when people say "Unix," they usually mean "a Unix-like operating system," a generalization that includes Linux.

If you'd like further information on this topic, you might be interested in consulting A Quarter Century of UNIX by Peter H. Salus (Addison-Wesley 1994), which has become the standard text on the subject.



2.1.2 What's Free Software?

WWW: http://www.gnu.org/philosophy/free-sw.html

Over the years, Unix's popularity grew. After the divestiture of AT&T, the tapes of the source code that Bell Labs provided became a proprietary, commercial product: AT&T UNIX. But it was expensive, and didn't come with the source code that made it tick. Even if you paid extra for a copy of the sources, you couldn't share with your programmer colleagues any improvements or discoveries you made.

By the early 1980s, proprietary software development, by only-for-profit corporations, was quickly becoming the norm--even at universities. More software was being distributed without source code than ever before.

In 1984, while at the Massachusetts Institute of Technology in Cambridge, Massachusetts, hacker Richard Stallman saw his colleagues gradually accept and move to this proprietary development model. He did not accept the kind of world such a proprietary regime would offer: no sharing your findings with your fellow man, no freedom for anyone to look "under the hood" of a published work to see how it worked, so as to understand it or build upon it, and no freedom to improve your copy of such works or do what you please with your copy--including sharing it with others.

So instead of giving in to the world of non-free computing, Stallman decided to start a project to build and assemble a new Unix-like operating system from scratch, and make its source code free for anyone to copy and modify. This was the GNU Project ("GNU's Not Unix").(4)

The GNU Project's software would be licensed in such a way that everyone was given the freedom to copy, distribute, and modify their copy of the software; as a result, this kind of software became known as free software.

Individuals and businesses may charge for free software, but anyone is free to share copies with their neighbors, change it, or look at its source code to see how it works. There are no secrets in free software; it's software that gives all of its users the freedom they deserve.

Proprietary software strictly limits these freedoms--in accordance with copyright law, which was formulated in an age when works were normally set and manipulated in physical form, not stored as the non-physical data that computers copy and modify.

Free software licensing was developed as a way to work around the failings of copyright law, by permitting anyone to copy and modify a work, though under certain strict terms and conditions. The GNU Project's GNU General Public License, or GNU GPL, is the most widely used of all free software licenses. Popularly called a "copyleft," it permits anyone to copy or modify any software released under its terms--provided all derivatives or modifications are released under the same terms, and all changes are documented.



2.1.3 What's Open Source?

WWW: http://www.opensource.org/
WWW: http://www.gnu.org/philosophy/free-software-for-freedom.html

The term open source was first introduced by some free software hackers in 1998 as a marketing term for "free software." They felt that some people unfamiliar with the free software movement--namely, large corporations, who'd suddenly taken an interest in the more than ten years' worth of work that had been put into it--might be scared by the word "free." They were concerned that decision-makers in these corporations might confuse free software with things like freeware, which is software provided free of charge, and in executable form only. (Free software means nothing of the sort, of course; the "free" in "free software" has always referred to freedom, not price.)

The Open Source Initiative (OSI) was founded to promote software that conforms with their public "Open Source Definition," which was derived from the "Debian Free Software Guidelines" (DFSG), originally written by Bruce Perens as a set of software inclusion guidelines for Debian. All free software--including software released under the terms of the GNU General Public License--conforms with this definition.

But some free software advocates and organizations, including the GNU Project, do not endorse the term "open source" at all, believing that it obscures the importance of "freedom" in this movement.(5)

Whether you call it free software, open source software, or something else, there is one fundamental difference between this kind of software and proprietary, non-free software--and that is that free software always ensures that everyone is granted certain fundamental freedoms with respect to that software.



2.1.4 What's Linux?

In the early 1990s, Finnish computer science student Linus Torvalds began hacking on Minix, a small, Unix-like operating system for personal computers then used in college operating systems courses.(6) He decided to improve the main software component underlying Minix, called the kernel, by writing his own. (The kernel is the central component of any Unix-like operating system.)

In late 1991, Torvalds published the first version of this kernel on the Internet, calling it "Linux" (a play on both Minix and his own name).(7)

When Torvalds published Linux, he used the copyleft software license published by the GNU Project, the GNU General Public License. Doing so made his software free for anyone to use, copy, and modify--provided any copies or variations were kept equally free. Torvalds also invited contributions from other programmers, and those contributions came: slowly at first, but as the Internet grew, thousands of hackers and programmers from around the globe contributed to his free software project. The Linux software was so greatly extended and improved that the Linux-based system of today is a complete, modern operating system, usable by programmers and non-programmers alike; hence this book.



2.1.5 What's Debian?

WWW: http://debian.org/

It takes more than individual software programs to make something that we can use on our computers--someone has to put it all together. It takes time to assemble the pieces into a cohesive, usable collection, and test it all, and then keep up to date with the new developments of each piece of software (a small change in any one of which may introduce a new software dependency problem or conflict with the rest). A Linux distribution is such an assemblage. You can do it yourself, of course, and "roll your own" distribution--since it's all free software, anyone can add to it or remove from it and call the resulting concoction their own. Most people, however, choose to leave the distribution business to the experts.

For the purposes of this book, I will assume that you are using the Debian GNU/Linux distribution, which, of all the major distributions, is the only one designed and assembled in the same manner that the Linux kernel and most other free software is written--by individuals.

And when I say "Linux" anywhere in this book (including in the title), unless noted, I am not referring to the bare kernel itself, but to the entire working free software system as a whole. Some people call this "GNU/Linux."(8)

There are many other distributions, and some of them are quite acceptable--many users swear by Red Hat Linux, for example, which is certainly popular, and reportedly easy to install. The SuSE distribution is very well-received in Europe. So when people speak of Debian, Red Hat, SuSE, and the like in terms of Linux, they're talking about the specific distribution of Linux and related software, as assembled and repackaged by these companies or organizations (see section Linux Resources on the Web). The core of the distributions is the same--they all contain the Linux kernel, the GNU Project software, and various other free software--but each distribution has its own packaging schemes, defaults, and configuration methods. It is by no means wrong to install and use any of these other distributions, and every recipe in this book should work with all of them (with the exception of variations that are specific to Debian systems, which are labelled as such in the text).

In Debian's early days, it was referred to as the "hacker's distro," because it could be very difficult for a newbie to install and manage. However, that has changed--any Linux newbie can install and use today's Debian painlessly.

NOTE: I recommend Debian because it is non-corporate, openly developed, robust (the standard Debian CD-ROM set comes with more than 2,500 different software packages!), and it is entirely committed to free software by design (yes, there are distributions which are not).



2.1.6 Unix and the Tools Philosophy

WWW: http://cm.bell-labs.com/cm/cs/upe/
WWW: http://www.cs.bell-labs.com/cm/cs/pearls/

To understand the way tasks are performed on Linux, some discussion of the philosophy behind the software that Linux is built upon is in order. A dip in these inviting waters will help clarify the rôle of this book as a "cookbook."

The fact that the Unix operating system has survived for more than thirty years should tell us something about the soundness of its design considerations. One of these considerations--perhaps the most endearing--is the "tools" philosophy.

Most operating systems are designed with a concept of files, come with a set of utility programs for handling these files, and then leave it to the large applications to do the interesting work: a word processor, a spreadsheet, a presentation designer, a Web browser. (When a few of these applications recognize each other's file formats, or share a common interface, the group of applications is called a "suite.")

Each of these monolithic applications presumably has an "open file" command to read a file from disk and open it in the application; most of them, too, come with commands for searching and replacing text, checking spelling, printing the current document, and so on. The program source code for handling all of these tasks must be accounted for separately, inside each application--taking up extra space both in memory and on disk. This is the anti-Unix approach.

And in the case of proprietary software, all of the actual program source code is kept from the public--so other programmers can't use, build on, or learn from any of it. This kind of closed-source software is presented to the world as a kind of magic trick: if you buy a copy of the program, you may use it, but you can never learn how the program actually works.

The result is that the code to handle essentially the same function inside all of these different applications must be written by programmers from scratch, separately and independently, each time--so the progress of society as a whole is set back by the countless man-hours programmers waste reinventing the same software functions to perform the same tasks, over and over again.

Unix-like operating systems don't put so much weight on application programs. Instead, they come with many small programs called tools. Each tool is generally capable of performing a very simple, specific task, and performing it well--one tool does nothing but output the file(s) or data passed to it, one tool spools its input to the print queue, one tool sorts the lines of its input, and so on.
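
For instance, the tool that does nothing but output the data passed to it is called cat, and the tool that sorts the lines of its input is called sort. As a minimal sketch--the file `grocery-list' here is hypothetical--each tool performs its one simple job:

$ cat grocery-list RET
milk
bread
apples
$ sort grocery-list RET
apples
bread
milk
$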

An important early development in Unix was the invention of "pipes," a way to pass the output of one tool to the input of another. By knowing what the individual tools do and how they are combined, a user could now build powerful "strings" of commands.

Just as the tensile strength of steel is greater than the combined strength of its components--iron and carbon--multiple tools can be combined to perform a task unpredicted by the function of the individual tools. This is the concept of synergy, and it forms the basis of the Unix tools philosophy.(9)

Here's an example, using two tools. The first tool, called who, outputs a list of users currently logged on to the system (see section Listing Who Is on the System). The second tool is called wc, which stands for "word count"; it outputs a count of the number of words (or lines or characters) of the input you give it (see section Counting Text).

By combining these two tools, giving the wc command the output of who, you can build a new command to list the number of users currently on the system:

 
$ who | wc -l RET
        4
$

The output of who is piped--via a "pipeline," specified by the vertical bar (`|') character--to the input of wc, which through use of the `-l' option outputs the number of lines of its input.

In this example, the number 4 is shown, indicating that four users are currently logged on the system. (Incidentally, piping the output of who to wc in this fashion is a classic tools example, and was called "the most quoted pipe in the world" by Andrew Walker in The UNIX Environment, a book that was published in 1984.)
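
(A hedged aside: with no options at all, wc outputs all three of its counts--lines, words, and characters--so the same pipeline without `-l' would print three numbers. The figures shown below are only illustrative:)

$ who | wc RET
      4      24     164
$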

Another famous pipeline from the days before spell-check tools goes something like this:

 
$ tr -cs A-Za-z '\012' | tr A-Z a-z | sort -u | 
comm -23 - /usr/dict/words RET

This command (typed all on one line) uses the tr, sort, and comm tools to make a spelling checker: after you type it, the lines of text you enter (until you interrupt it) are converted by the two calls of tr into a single-column list of lowercase words; sort puts that list in alphabetical order and ferrets out all duplicates; and comm compares the resultant list with `/usr/dict/words'--the system "dictionary," a list of properly-spelled words kept in alphabetical order--and outputs only those words not found in it (see section Spelling).
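
Incidentally, on many modern systems the dictionary is kept at `/usr/share/dict/words' rather than `/usr/dict/words', and you need not type the text interactively--redirecting a file to the pipeline's input checks that file instead. A sketch, assuming a hypothetical file `essay' and the newer dictionary location:

$ tr -cs A-Za-z '\012' < essay | tr A-Z a-z | sort -u | 
comm -23 - /usr/share/dict/words RET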

Collective sets of tools designed around a certain kind of field or concept were called "workbenches" on older Unix systems; for example, the tools for checking the spelling, writing style and grammar of their text input were part of the "Writer's Workbench" package (see section Checking Grammar).

Today the GNU Project publishes collections of tools under certain general themes, such as the "GNU text utilities" and "GNU file utilities," but the idea of "workbenches" is generally not part of the idiom of today's Unix-based systems. Needless to say, we still use all kinds of tools for all kinds of purposes; the great bulk of this book details various combinations of tools to obtain the desired results for various common tasks.

You'll find that there's usually one tool or command sequence that works perfectly for a given task, but sometimes a satisfactory or even identical result can be had by different combinations of different tools--especially at the hands of a Unix expert. (Traditionally, such an expert was called a wizard.)

Some tasks require more than one tool or command sequence. And yes, there are tasks that require more than what these simple craft or hand tools can provide. Some tasks need more industrial production techniques, which are currently provided by application programs. So we still haven't avoided applications entirely; at the turn of the millennium, Linux-based systems still have them, from editors to browsers. But our applications use open file formats, and we can use all of our tools on these data files.
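
For example, if a word processor saves its work in an open, plain-text format, the ordinary tools can operate on that document directly--here, wc counts the words of a hypothetical file called `report.txt':

$ wc -w report.txt RET
1482 report.txt
$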

The invention of new tools has been on the rise along with the increased popularity of Linux-based systems. At the time of this writing, there were a total of 1,190 tools in the two primary tool directories (`/bin' and `/usr/bin') on my Linux system. These tools, combined with necessary applications, make free, open source software--for perhaps the first time in its history--a complete, robust system for general use.
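
You can reproduce such a count with the tools themselves. A minimal sketch, assuming `/bin' and `/usr/bin' are the directories to survey (the sort -u guards against counting a name twice on systems where one directory is merged into the other); the number output will of course vary from system to system:

$ { ls /bin; ls /usr/bin; } | sort -u | wc -l RET
1190
$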



2.2 What to Try First

The first four chapters of this book contain all of the introductory matter you need to begin working with Linux. These are the basics.

Beginning Linux users should start with the concepts described in these first chapters. Once you've learned how to power up the system and log in, you should look over the chapter on the shell, so that you are familiar with typing at the command prompt, and then read the chapter on the graphical windows interface called the X Window System, so that you can start X and run programs from there if you like.

If you are a Linux beginner and are anxious to get up to speed, you might want to skip ahead and read the chapter on files and directories next, to get a sense of what the system looks like and how to maneuver through it. Then, go on to learning how to view text, and how to edit it in an editor (respectively described in the chapters on viewing text and text editing). After this, explore the rest of the book as your needs and interests dictate.

So, to recapitulate, here is what I consider to be the essential material to absorb for familiarizing yourself with the basic usage of a Linux system:

  1. Introduction (this current chapter).

  2. What Every Linux User Knows.

  3. The Shell (ignoring the section on customization for now).

  4. The X Window System (ignoring the section on configuration for now).

  5. Files and Directories.

  6. Viewing Text (mostly the first section, Perusing Text).

  7. Text Editing (enough to select a text editor and begin using it).

If you have a question about a tool or application in particular, look it up in the program index (see section Program Index). The index proper, listing recipe names and the general concepts involved, is called the concept index (see section Concept Index).



2.3 If You Need More Help

If you need more help than this book can give, remember that you do have other options. Try these steps for getting help:

  • Chances are good that you are not alone in your question, and that someone else has asked it before; therefore, the compendiums of "Frequently Asked Questions" just might have the answer you need: the Debian FAQ and the Linux FAQ.

  • The Linux Documentation Project is the center of the most complete and up-to-date Linux-related documentation available; see if there is a document related to the topic you need help with.

  • The Usenet newsgroups news:comp.os.linux.help and news:linux.debian.user are often an excellent place to discuss issues with other Linux users. (Usenet is described in Reading Usenet).

  • Check http://linux.com/lug/ to find the Linux User Group ("LUG") nearest you--people involved with LUGs can be great sources of hands-on help, and it can be fun and rewarding to get involved with other Linux and free software enthusiasts in your local area.

  • Finally, you can hire a consultant. This may be a good option if you need work done right away and are willing to pay for it.

    The Linux Consultants HOWTO is a list of consultants around the world who provide various support services for Linux and open source software in general (see section Reading System Documentation and Help Files). Consultants have various interests and areas of expertise, and they are listed in that document with contact information.

