Cambridge University Science Magazine


"SUN Microsystems is moving toward making its Java technology fully open-source, a company executive said Tuesday, addressing an audience of programmers at the ApacheCon Europe 2000."

-- InfoWorld




News like this speaks for itself. The open source model, promoted by the geek community since the dawn of computing, is finally going mainstream and revolutionizing the software industry. What does it mean, and how did it all happen?

A program being 'open source' basically means it can be copied and modified without restriction. Modification requires access to the source code, which is usually not available for commercial software. Moreover, the modified versions must enjoy the same freedoms. The distribution of such software is largely thanks to the Internet. Because of the number and enthusiasm of the people involved in open source projects, bugs are found quickly, and fixes are often available within days of a bug report. On the other hand, the careful development and review cycle may make progress appear slow. But the hackers (i.e. talented programmers who work on these projects for fun) are more interested in creating reliable products than in releasing on set dates.

The open source community may appear to be a barter economy: its members happily give away the programs they create, in exchange for the wide variety of others' code available. It is not a well-defined group of people, however. A hacker may join a certain project simply to fix a problem others could not, and then move on.

The products of open source are free for anyone, even a non-hacker, to use. Certainly the 'members' should and do allow this, so it is not a pure barter economy. A crucial question is whether people can make a living by hacking open source. An increasing number do, thanks to companies dedicated to distributing open source software.

At this point I should mention that bluesci.com is itself a pure open source project. We have downloaded an open-source site engine, modified it for our purposes, and will make the result freely available. None of us is making any money from it right now; basically, we are doing it for fun. And for the experience, which many employers will appreciate.

The prime example of an open source project is the GNU/Linux operating system, started in 1991. The Finnish computer science student Linus Torvalds, having used a UNIX system at university, thought it superior to the then prevalent MS-DOS / Windows 3 and wanted one for himself. Starting from Minix, a small educational UNIX-like system for PCs, Torvalds decided to write his own. He announced the idea on an Internet newsgroup, and many other hackers joined in. It cannot be stressed enough that Linux is only the kernel, the core of the operating system, while the user interfaces and applications come mostly from the GNU project. The original codename for the project was Freax (free freak unix), but fortunately Torvalds was convinced that something less geeky was required if it was to go mainstream.

"Linux represents a best-of-breed UNIX, that is trusted in mission critical applications, and - due to it's open source code - has a long term credibility which exceeds many other competitive OS's."

-- The Microsoft 'Halloween II Document'


Why has Linux not made the breakthrough to desktop domination? From the user's point of view, it is often the case that 'better is the worst enemy of good enough'. This is a common problem even within a single operating system or application: users may stick to the mouse instead of learning keyboard shortcuts, although the latter would prove faster in the long run.

Stability is an important issue where Linux certainly has an edge over Windows. However, the implications are not obvious. Torvalds himself has remarked that most people do not think of stability at all, because of the misconception that instability is somehow inherent in computers. Through the dominance of Windows on desktops, people have become so used to 'crashes' that they do not know they should demand anything better.

But there is a lot more to IT than desktop computers. The maintainers of file and network servers, including those that form the basis of the Internet, are very much interested in stability. Thus, it is no surprise that this is the segment where Linux has gained the most popularity. It is estimated that more than 50% of the WWW servers run Linux. This makes sense from a conceptual point of view, as Linux is a kind of UNIX. The Internet has evolved together with UNIX, from the important milestone of the first ARPANET connection in 1969 to the modern ideas of the Net. In addition to networking, UNIX was designed for multi-user systems, so true multi-tasking was essential from the start.

Compare this with Windows, whose story starts with MS-DOS around 1980. This single-user system was drastically simplified compared to UNIX, which eventually made it possible for everyone to have a computer at home. It is worth appreciating that, without the MS-DOS PC, computers might never have gained such widespread popularity. Only after Microsoft had brought the computer within people's reach did bright young individuals like Torvalds have the chance to start hacking something new.

It would be far too optimistic to assume that the first widely used computer systems were the best ones. This is simply evolution, an idea recently misused in gross fashion by Microsoft in an advertisement in a German computing magazine. It portrayed a penguin (the Linux mascot) in various mutated forms, and praised the apparent stability and unity of Windows. Too bad the copywriters had 'forgotten' that it was only through myriad successive mutations and natural selection that humans evolved from the primordial ooze.

This is not to say that Windows has not undergone evolution as well. However, until its latest versions, Windows was built on the ancient MS-DOS. Even though this was no longer apparent from version 95 onwards, the result is a huge amount of scripting and hacking on top of the old kernel. This changed with Windows NT and the versions based on it, but by that time the interface had become so established that nothing but the traditional windowing environment could be considered, so even NT indirectly suffers from MS-DOS's limitations.

I often try to explain to people how Windows is hopelessly limited in comparison to Linux and other Unices, though one can only really appreciate this after making the transition from one to the other. The main reason UNIX systems are more flexible is their modularity: for example, the kernel and the user interface are completely separate. A complex piece of user-level software may crash, but only very rarely does this lead to problems elsewhere in the system. One can also choose between different interfaces. This is not just for the geek factor: different tasks require different working environments if they are to be done efficiently.
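The same modularity shows up at the command level, too. As a small sketch (my own illustration, not from the article): UNIX composes independent programs through pipes, and any one of them can be swapped out without touching the others.

```shell
# Count name occurrences by chaining three independent tools.
# sort groups the duplicates, uniq -c counts each group, and the
# final sort -rn puts the most frequent name first. Replacing any
# stage changes behaviour without the other stages knowing.
printf 'carol\nalice\nbob\nalice\n' | sort | uniq -c | sort -rn
```

Here 'alice' appears twice, so its line comes out on top; the point is that none of the three programs was written with the others in mind.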

The amount of choice can be intimidating to the typical Windows user wishing to learn Linux. The question is often raised of why one would need different user interfaces when a single one is good enough in Windows. In my opinion, much of the appearance of graphical interfaces serves no real purpose except to attract novice users. For instance, word processing in the UNIX realm is still usually done in text mode. The system allows the writer to concentrate on content, while the appearance is determined by ready-made templates. The increase in productivity is obvious. Of course, there are graphical interfaces as well, for image manipulation and the like.
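A minimal LaTeX document (my own hypothetical example) shows what this separation looks like in practice: the writer marks structure only, and the document class, the ready-made template, decides fonts, margins, and numbering.

```latex
% The source records structure, not appearance. Switching the class
% from `article' to some other template restyles the whole document
% without touching a word of the text below.
\documentclass{article}
\begin{document}
\section{Why open standards matter}
Plain, structured source like this can be read and edited
on any computer, with any text editor.
\end{document}
```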

Given its advantages and low cost, it may seem a mystery that Linux is not more widely used by enterprises. But when the businesses you communicate with use Word documents, it is certainly counterproductive for you to switch to Linux. The case of Word documents is particularly interesting because of version incompatibilities: not only do you need Word, you need the latest version. With so many businesses using Windows, Microsoft need not obey standards. Things would be interesting if there were, say, a 50-50 split between two software companies in the market. That would only be plausible with the use of standard file formats.

This is one of the many things I love about Linux. The open source community has little choice but to stick to standards: for instance, software documentation comes as plain text or HTML files, which can be viewed on any computer. With the Internet connecting computer users around the globe, the importance of standards is bound to increase.

I could go on ranting about the superiority of open source for ever. But really, the only way to convince yourself is to try it out. You have the choice. And once you're using a free system, you'll have plenty more... :-)


Risto A. Paju is an Undergraduate in Physics at Queens'