As nice as KDE and Gnome are, they use system resources like popcorn. If all your users need to do is start a few programs, try a more lightweight desktop such as Blackbox. Though your distribution should set up the basics for you, you will probably have to edit the configuration files (in this case, the Blackbox menu file that is specified in ~/.blackboxrc) for each user. Also, make sure your users know how to work the environment. At the very least, teach them that CTRL-ALT-BACKSPACE kills the X server.
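To give you an idea of what such an edit involves, here is a sketch of a minimal per-user menu file; the entries are just examples, and the session.menuFile line in ~/.blackboxrc tells Blackbox where to find it:

[begin] (Workstation)
  [exec] (Terminal) {xterm}
  [exec] (Mail) {xterm -e mutt}
  [restart] (Restart)
  [exit] (Exit)
[end]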
But real men and women don't need a graphical user interface (GUI) at all: They use a command shell such as bash. Before X Windows gave us graphics, the Free Software Foundation (FSF) had created the GNU tools that are as rock steady as any piece of software on the planet. They are the heart of every distribution, and without them, there would be no "Linux" system (which is why "GNU/Linux" is the more precise term). If you have no choice but to get by with really weak hardware — we're talking anything down to a 386SX here — you can dump X Windows altogether and get along just fine. Even if you stick to GUIs, some basic knowledge of the shell can help you get far more out of your system.
Think of Linux on the command line as the Willow Rosenberg approach to computers: Whereas GUIs are as spectacular as a punch on the nose by vampire slayer Buffy Summers, even a little knowledge of the shell will let you work nuanced magic of nearly unlimited power with little effort. True fans of the TV series will realize that there is a warning implied here: The power of the shell can become habit forming, if not downright addictive, and you can destroy your whole system with no chance of recovery if you mess things up. Using bash takes you as close to the raw energies of your machine as you can get without using a C compiler, and the danger rises accordingly.
It took Willow six years to become a witch powerful enough to end the world, but it should take you a few weeks at most to become familiar with the command line. Here are four paragraphs to help you decide if you want to make the effort:
The power of the command line environment is rooted in its design philosophy: Each tool is designed to do one job and one job only, but to do that job superbly. Also, almost every tool can be connected to every other tool to create processing chains with just a few commands. Since these tools are (almost) all general purpose, you can solve just about any problem with the right combination. With these same commands, you can write little programs (shell scripts) for everyday tasks. If you look closely at the programs your distributor includes, you will see that a lot of them are in bash. Other script languages such as Python or Perl might be more powerful, but the command line is always included and has far less overhead.
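To show what such a chain looks like in practice, here is a sketch (nothing about it is specific to any distribution) that counts how many sessions each user currently has open on the shared machine:

who | awk '{print $1}' | sort | uniq -c | sort -rn

Four small tools, none of which knows anything about the others, combine into a little report that would otherwise need a program of its own.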
It is learning the individual tools of the CLI that is somewhat daunting. A lot of commands have strange names that don't even pretend to be mnemonic (the pattern scanning tool awk is named for its creators Aho, Weinberger, and Kernighan), only make sense in a historical context (the tape archiving utility tar is now used to distribute compressed files), or look like they are typos (umount instead of "unmount", passwd instead of "password"). There can be dozens of options for each command, and they can be just as cryptic. Because the system was written by hackers in the true sense of the word who wanted the computer to get the job done and not talk about it, the shell normally will not ask you for confirmation, even if you tell it to delete every single file on your hard disk. This is where the end of the world scenario from Buffy comes in.
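As a small illustration (the directory name is made up), the first command below silently removes a whole directory tree without a single question asked, while the second one, thanks to the -i option, at least asks before each deletion:

rm -rf oldstuff
rm -ri oldstuff

Many people alias rm to rm -i on shared systems for exactly this reason.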
Once you have mastered the basics of the shell, however, you will get stuff done a lot faster, you will understand jokes such as rm -rf /bin/laden, and you will develop a spring in your step and a glint in your eye. This is why even people who are young enough to have been born after the invention of the mouse develop a tendency to use X Windows merely as a comfortable way to open a lot of terminal windows (either xterm or the less resource-hungry rxvt).
The CLI has just about every tool you'll need: mutt or pine for email (real hard-core basket cases use mail), w3m or lynx for surfing, and of course the legendary editors vi (more commonly vim these days) or emacs. The obvious exceptions to this rule are programs that let you view pictures. But then you probably aren't interested in that sort of thing anyway, are you.
Basically, you have the same options for text terminals as you do with X terminals. Everything is just a bit easier.
For example, you don't have to reboot if you are forced to use a different operating system: Any program that lets you log in via telnet (on secure, closed networks) or ssh (everywhere else) will do. Microsoft Windows includes a telnet client that is best described as rudimentary; for serious work, try a free Win32 implementation such as Simon Tatham's PuTTY http://www.chiark.greenend.org.uk/sgtatham/putty/. Apple users with Mac OS X should have no problems with their clients.
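On any machine that already has an ssh client, logging in to the mock mainframe is a one-liner (the user and host names here are placeholders, of course):

ssh yourname@mockmainframe

From there on you are working in your normal shell account, just as if you were sitting at the machine itself.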
The Linux Terminal Server Project also has a package for text terminals. The hardware can be as basic as it gets: Go find yourself a 386DX (for those of you who don't remember the Soviet Union or the first Star Trek series: This is the original Pentium's grandaddy). The mainboard will probably not have a PCI slot, so you'll need an ISA graphics card and an ISA network card. These are so low down the hardware chain you might have problems finding them, because they are being junked, not sold second hand.
There is no reason, though, why your computer has to be advanced enough to understand the TCP/IP protocol and be part of your local network at all. You can connect just about any computer to the serial port(s) of the mock mainframe: For example, there is a Linux HOWTO for older Macs by Robert Kiesling (The MacTerminal MINI-HOWTO); in an article in The Linux Gazette http://www.linuxgazette.com/issue70/arndt.html, Matthias Arndt shows how to convert an Atari ST into a terminal; and Nicholas Petreley explains in IT World.com http://www.itworld.com/Comp/2384/LWD010511penguin2/ how to use your Palm Pilot. If you can get it connected to the serial port, chances are you can get it running on Linux. There are special cards with multiple serial ports for larger setups. Of course, there is a HOWTO for that as well: The Serial HOWTO by David S. Lawyer.
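On the mock mainframe itself, hanging a login prompt on a serial port is usually a one-line affair. The details vary between init systems and getty flavors, so treat the following /etc/inittab entry as a sketch to check against the HOWTOs rather than something to copy blindly:

T0:23:respawn:/sbin/getty -L ttyS0 9600 vt100

After telling init to reread its configuration with telinit q, whatever is plugged into the first serial port should be greeted with a login prompt.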
You can also get special text terminals as individual machines. David S. Lawyer has written an extensive Linux HOWTO on the subject (Text-Terminal-HOWTO) that explains how they work, how you set them up, and why you would want one.
To get you started on the shell, here are a few commands that are especially useful if you are sharing a system. These very basic examples were chosen to be useful to normal users.
Play nice. The nice command is one of those things that would make the world a better place if everybody used it more often, but nobody does. It allows you to lower the scheduling priority of a process so that less important programs don't get in the way of the important ones.
For example, assume you have a WAV recording of your own voice as you sing a song under the shower, and you want to convert it to the Ogg Vorbis format to distribute to your fans on the Internet, all three of them. A simple command to do this is
oggenc -o shower.ogg shower.wav
Encoding music is a CPU-intensive process, so performance will drop. Now, if a few minutes more or less don't matter, just start the line off with nice:
nice oggenc -o shower.ogg shower.wav
Now the encoding will be run with a lower priority, but you will still have to wait for it to finish before you can use the shell again. To have the computer execute a command in the background, add an ampersand ("&") to the end of the line:
nice oggenc -o shower.ogg shower.wav &
The shell will respond by giving you a job number and a process id (PID), and then will ask you for the next command.
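In bash, that response looks something like this (the numbers are, of course, made up):

[1] 4711

The 1 is the job number and 4711 the PID; if you change your mind, fg %1 brings the job back to the foreground.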
The nice command is a good example of the power that was lost when graphical interfaces became the default: There is no simple way to adjust the priority of a process with a mouse-driven interface.
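The closest command-line relative, by the way, is renice, which changes the priority of a process that is already running; here is a quick sketch using the made-up PID from above:

renice 19 -p 4711

Ordinary users may only make their own processes nicer (higher values), never less nice, which is exactly the sort of manners you want on a shared machine.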
Do it later. Another way to spread the load is to have an intensive process start at a time when the system is not being used much. Depending on who is on the system with you, this could be three o'clock in the morning or any time until two o'clock in the afternoon.
The at command lets you set a time to start a program or any other process that can be run from the command line. To have our shower song encoded at eight in the evening when you are out watching meaningful French love films, you enter the command "at" followed by the time you want execution to start, and then hit ENTER. Then you type in the command itself, followed by another ENTER, and finally a CTRL-d to finish the sequence:
me@mycomputer:> at 20:00
warning: commands will be executed using /bin/sh
> nice oggenc -o shower.ogg shower.wav
> <CTRL-d>
job 1 at 2003-09-28 20:00
The at command accepts just about any time format: Americans get to use their quaint "08:00pm" notation instead of "20:00", and there is a whole set of shortcuts like midnight, noon or even teatime. at sends the output of the command to your mailbox.
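If you find the interactive prompt clumsy, you can also pipe the command in; the shortcuts work there as well (the file names are the ones from the example above):

echo "nice oggenc -o shower.ogg shower.wav" | at midnight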
Do it when you are bored. A close relative of at uses system load, not time of day, to determine when a command should be run: batch saves the execution for a time when the system load has fallen below a certain value (to see what your system load currently is, run uptime from a shell or xload under X Windows). The documentation gives this value as 0.8. The syntax for batch is basically the same as for at, except that the time field is optional.
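A batch session therefore looks almost exactly like the at example above; the job number and date in the last line are just an illustration, since they depend on when the load actually drops:

me@mycomputer:> batch
warning: commands will be executed using /bin/sh
> nice oggenc -o shower.ogg shower.wav
> <CTRL-d>
job 2 at 2003-09-28 14:05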