
Open Source: The Unauthorized White Papers

Chapter 13

Companies and Projects

This final part of the book does not try to be predictive — Open Source happens too quickly — but it does point out some themes you should watch over the next year. In Chapter 1 we looked at business and social forces feeding into Open Source; this chapter will look more closely at some of them as they interact.

W.R. Hambrecht + Co. estimates the size of the Open Source market in 2000 at nearly two billion dollars, and predicts a size of eleven billion dollars in 2003. These estimates include hardware, software, services, and even on-line advertising, but omit the embedded market.

Open Source Opportunities

Open Source is bringing the developer and the user closer together. As originally conceived, Free Software’s users were developers: users compiled the source code themselves, adding tweaks along the way. But all across the business spectrum there is now a tendency for middlemen to be pushed aside, for layers of distribution to be thinned and flattened, and for customers and producers to face each other more directly. The ultimate step is for the customer to buy everything from the manufacturer, but the intermediate step of being the sole middleman is most attractive and lucrative to businesses that can figure out how to take this space. A vertical market, for instance, can be run by the customer (like the auto-manufacturer consortium that GM is setting up to buy from suppliers), or it can be run by a third party when the market is so fragmented that no single buyer or small group of buyers dominates it. The middleman’s essential position in these markets is as supplier of information and guarantor of transactions (a role to which the auction sites aspire).

The Internet makes buyers better able to gather and share information among themselves. Discussion lists and consumer opinion sites are established means. Now new technologies enable users to privately mark up Web sites with their own comments and observations, so that other users of the technology can read the remarks while the actual Web page under discussion remains unmarked. The Foresight Institute provides Web site mark-up software called CritSuite, while a commercial site, Third Voice, provides its own technology. The ever-present voice of Web criticism has set up a more traditional form of on-line critique at a counter site, Say No to Third Voice. Both types of commentary are upsetting to traditional markets and marketers. In an unpublished article, Sam Byassee of Smith, Helms, Mullis & Moore argues that the insertion of HTML code into the display stream of a Web site violates the copyright owner’s right to display, one of the bundle of rights set up in the Copyright Act of 1976.

Whatever the courts may ultimately say on the subject, these protests and commentary are simply an outgrowth of software developers’ on-line conversations that have evolved over the years. When software tool companies learned years ago that developers had set up on-line discussion lists to compare notes on particular tools they were using, the smarter vendors learned to send a diplomatic engineer over to the site to deal with problems and the inevitable flaming complaints. Before long the vendors provided their own discussion sites, and also offered on-line help and patches for the products. These practices were translated to the Web, at first for the developer community, and then for customers of any product whose vendor chose to recognize a new way of dealing with customers and turning them into a loyal group.

Recognizing customer needs and building communities work just as well if you are trying to build a business in the Open Source world. The product need not even be Open Source, just tied to that market. Someone thought it would be cool if Linux folks could carry a credit card with a penguin on it; an entrepreneur just out of college picked up the idea and with minimal cash and maximum persistence created the Linux Fund, which issues the free cards to credit-qualified Penguinistas. From the issuing bank’s perspective, the demographic of single young males with high-paying jobs is attractive. The Fund is set up as a nonprofit, and selects Open Source software projects to fund with its surplus cash derived from a share in the merchant transaction fees.

Another example of a business with modest origins that is capitalizing on the Open Source community is the Web comic strip User Friendly. It is by a geek and for geeks, and visited often by a loyal following that has come to be called the Ufie community. The growth of User Friendly shows the value of listening to customers: they asked for a book of the cartoons, and got it. They asked for caps and T-shirts and stuffed dolls, and User Friendly has supplied them. The cartoon has its own site now, sells on-line advertising, and continues to cultivate its geek community by providing an employment site and an advisor for the lovelorn.

Besides starting businesses from scratch, existing businesses can find ways of serving Open Source community needs. Tucows began as a Web site for the distribution of Windows software for Web connectivity. As the Web became an easier place to deal with technically, Tucows added other Windows software, Macintosh software, and more recently Linux software; it has also purchased the Linux Weekly News.

All of these examples are part of the growing service sector in our economy. Open Source businesses themselves are service businesses or, more exactly, businesses based on servicing relationships. Just as VA Linux is expanding beyond the hardware business (where margins are traditionally low), Red Hat is expanding beyond software and software support services into new service territories.

World Domination by Whom?

Whatever the origins of the Open Source movement discussed in Part I of this book, no one should believe that the commercialization of Open Source software is a giveaway program. The successful companies take a pragmatic approach to Open Source, just as the Open Source movement itself tempers the ideology of the Free Software movement by attempting to reconcile Free Software goals with proprietary software. Large companies still clash in the software industry (we will look at IBM and Sun in a moment); Open Source has added new players to the struggle while giving the old players new resources.

Red Hat vs. Sun Microsystems

Red Hat is in a particularly visible position in the Open Source world. It can’t take credit for being the most profitable Open Source company, since it is currently not profitable at all, like many a young high-tech company. It is, however, successful at what it does, even to the point of stirring murmurs in the Open Source community about how commercial it is becoming. Part of this murmuring is the natural ill feeling that success attracts, and part of it is a recognition that Red Hat is now a public company with the burden of justifying its enormous market capitalization. Red Hat built its success on the cultivation of good relationships with developers; now it must cultivate good relationships with corporate customers, the only source of sufficient revenue to satisfy the expectations of its investors.

Red Hat is not discouraged that the Linux desktop is currently not so functional as the Windows desktop; it has said for years that it is in the server, not the desktop, business. Bob Young has displayed in his office for years a nineteenth-century still-life of the sort that once decorated dining rooms: a dead turkey suspended by its feet. The picture bears a brass label: Microsoft. Red Hat does not intend to attack Microsoft directly on the desktop, but to flank it by moving into a new space that will combine servers, close customer relationships, and large corporate customers: the emerging business of distributed computing.

From the beginning, the natural strength of Linux has been networking (borrowed from BSD, to be fair); Apache and Samba are currently the Linux killer applications. Linux enables many different sorts of systems to talk to each other. Embedded devices, particularly over wireless connections, will be an important extension of this network. The acquisition of Cygnus Solutions gives Red Hat an excellent position in the embedded market. Large corporations want to serve their data not only to remote users (the people on the small devices, such as palmtops), but also to in-house users who have over-the-wire network connections. This latter group will receive not only data, but also many of the applications they use, from servers. It is easier to keep software up to date on a server than on a desktop, and the constant servicing of Windows desktop systems has been enough of a drain on corporate IT that there is a growing market for application servers. Desktop PCs won’t have to be constantly upgraded to keep up with growing software demands.

Red Hat will enter this world not so much as a server to desktops, although the interconnecting powers of Linux make this possible, but as a server to far-flung embedded devices — and not just as a vendor of software for those servers, but as a complete service to run the servers and host the data and the applications. Corporations will be willing to pay a firm like Red Hat to provide these services as a turnkey solution rather than try to find the internal resources to do so, and will necessarily enter into close, long-term, and lucrative relationships. At that point, Microsoft becomes irrelevant. Even if, as is constantly rumored, Microsoft is porting its proprietary technologies to UNIX, the arrival of Microsoft’s Office suite on the Linux desktop would not move it any closer to the lead that Linux is developing in the embedded field. Microsoft is developing what it calls its Next Generation Windows Services, based on the extension of Windows APIs to the Web, but customers may not wish to repeat on the Web and in Web devices a lock-in they have already experienced on the desktop.

Red Hat may be bypassing Microsoft, but in the Web services space they are coming up against the people who think they invented the idea some years back, when they promoted it as "The Network is the Computer." Sun Microsystems has long thought that the remote serving of applications and data over the Internet was natural for them as a UNIX company, and their plans for revenue from the recently purchased StarOffice depend on an application server model. As for Linux, the upstart has already kicked Sun’s shins by eroding their market for Solaris on Intel, and Sun has never come to terms with the idea that it could adopt Linux for its low end while continuing to sell the more powerful Solaris at the high end. Sun may eventually come to this position; they have been feeling their way toward Open Source, and recently released the source code to the entry-level product of the Forte IDE suite under a Mozilla-like license.


If Red Hat is skirting Microsoft and confronting Sun, it has no current problems with IBM, which is actually helping to build a bigger market for Linux. IBM has this advantage over Red Hat: if some see Red Hat as a Linux Community member who is growing too corporate for their tastes, IBM is a former Microsoft, now tamed by the market and welcomed because it is rapidly learning the lessons of Open Source. IBM is cultivating Open Source developers; beyond the obvious step of hiring platoons of them, it has set up Open Source and Linux Zones at its developerWorks Web site, and aggressively seeks to make that site an Open Source developers’ portal.

IBM is careful to be distro-neutral. Besides winning goodwill from all sides of the Community, this neutrality reflects a natural advantage IBM gains from not putting out its own distro: IBM does not have to publish and maintain software that others are perfectly willing to provide. Because the GPL permits unlimited copying, IBM need not purchase more than one copy of GPL-licensed software, except for convenience (or to demonstrate that an IBM system will run with a fresh-off-the-shelf boxed distribution). IBM can instead concentrate on consulting services and the server and storage business.

Can Hardware Fork the Kernel?

As a hardware vendor IBM, like SGI, could choose to fork the Linux kernel in the interests of optimization for its own hardware. IBM would then have to follow the GNU GPL and ship the source code for these changes with the hardware. This GPL requirement means that third-party software vendors will always have a chance to make their applications run on such a system, preventing customer lock-in. The important decision for a hardware company is whether to make changes that Torvalds refuses to take into the kernel, thus forcing the company to support them on its own. IBM has its own version of Linux optimized for its S/390 mainframe, and for its NUMA-Q server, which clusters 64 Intel chips. IBM has more incentive to have its improvements accepted into the Linux kernel than to issue them as a distro because for IBM the software is only the means to selling the hardware. The difference from the UNIX wars is that Open Source increases the value of the product for the customer.

Hardware vendors accept the burden of code maintenance when they tweak Open Source software to run better on their hardware. As long as the changes do not ripple upward to interfere with the running of applications, they do not really constitute forks.

IBM began in Open Source by using Apache to open up its legacy hardware to modern connections. Linux became a means to free older applications from their isolation on older hardware (DB2 now runs on Linux) and to put their data in contemporary formats. The liberating of the software likewise means that hardware designs and choices can be more flexible.

IBM had a further purpose in backing Open Source Apache: to confront and defeat Microsoft’s mixing of its proprietary materials with Web standards, such as Microsoft’s HTTP-NG (for "Next Generation"). IBM did not want an important part of its strategy to end up under the control of another company, preferring to base its Web strategy on open Web standards. Apache contains three components targeted by Microsoft: the HTTP server, Java servlets, and XML. IBM helped Apache strengthen its HTTP server implementation to standard, and worked with Sun on improving JServ. The result was to make Apache much stronger and to lead on to the Jakarta project.

IBM worked on strengthening Apache’s XML as well, and then built Apache into its WebSphere e-commerce product. By offering WebSphere to important commercial e-commerce players such as Ariba, IBM was able to blunt Microsoft’s BizTalk sales. The fact that Apache runs on Windows NT was important to IBM’s plans, and IBM’s offer of help in improving Apache’s performance on NT was important to the loosely organized Apache Group. Open Source thus became a primary means of moving IBM — both software and hardware — forward, and the commercial firm and Open Source both found mutual profit in the arrangement. Non-Microsoft cross-platform strategies, whether Open Source or Java, are key to IBM’s strategy not to be pulled into the Microsoft orbit as Windows tries to work its way upward in the computing world.

Standards in the non-Linux world

The war between Microsoft and Everyone Else is a traditional computer business war about de facto standards. Together, the Wintel duopoly, Windows and Intel, have an enormous desktop (and a significant server) market share. The main survivor of the UNIX wars is Sun Microsystems, a hardware company that sells its own SPARC chips running the proprietary Solaris operating software. We have already seen that Linux, in a pattern once practiced by Sun itself, is attacking the Sun market from the bottom by winning the Intel-based Solaris market from Sun on price and performance, and by providing more hardware drivers on Intel systems than Sun does. Sun can tolerate this intrusion, and is even adjusting Solaris so that Linux applications can run on it. But Sun is facing a number of limitations, most of them of its own making.

The SPARC chip is no longer the powerhouse attraction it once was; the Solaris operating system is. People are paying premium prices for Sun hardware to run Solaris. This should be a warning to Sun that it has already made the transition from hardware company to software company. Besides climbing into the bottom of the Sun market, Linux is openly trying to move up the scale to rival Solaris. If Sun were to defend its position as a hardware company and simply adopt Linux as its operating system, it would suffer at the high end, where Linux cannot match Solaris. That inability of Linux to match Solaris at the high end is the basis of Sun’s long-term defense, which consists simply of improving Solaris at a pace that Sun hopes Linux can never reach. Nor need Sun share the secrets of its SPARC chip with Linux developers as Intel has with its Itanium chip. It is Sun’s gamble that these strategies will work, and one of the things to watch over the next few years is the success of this strategy. At the end of those few years Sun may be facing the very large machines of IBM; it will be interesting to see whether those machines will still be running AIX, or whether IBM will have promoted Linux up the ladder.

In the meantime, Sun, which has worked to ensure compatibility between Solaris and the third-party products that run on it, has taken measures to make sure that Linux programs will likewise run smoothly on Solaris. The effect of this measure will be to make software vendors indifferent as to whether their programs are running on Linux or not, and to preserve a market for the SPARC/Solaris machines that might otherwise be running Linux, or, more likely, be put aside in favor of Intel machines for running Linux.

Sun, Java, and Microsoft

Rather than rely on merely defensive measures in its niche, Sun has long been nurturing its solution to the Microsoft problem: Java. Like Windows, Java is a proprietary standard that its owner hopes to make into a de facto standard. This is classic computer marketing. As a countermeasure, Microsoft tweaked its licensed version of Java into a Microsoft-dependent implementation, and later dropped it in favor of its current plan of implementing Microsoft-proprietary Internet standards that may become de facto standards, particularly when supported by an 80 percent share of the Web browser market.

Sun, with its similarly proprietary attitude towards Java, has never dared to submit it to a standards body or to release it under an Open Source license. One official reason given is that Sun’s control prevents any forking of Java, a caution with some justification, for Sun knows that Microsoft will attempt to take over any standards body in control of the Java standard, and if given a free hand with the source code would return to its previous stance and mix Java freely with Microsoft technology. Sun would rather see Java become a universal programming language, perhaps even supplanting C++ if users were to move from C++ class libraries to Java class libraries. In that case, Linux support for the Java class libraries would make Java an important player across several platforms.

So far Sun’s attempts to build a community of Java developers have not been successful: Java developers have complained for years that Sun neither improves Java fast enough nor listens to their suggestions, concentrating instead on perfecting the Java Virtual Machine to run best on Sun equipment, to the disadvantage of Intel-based developers. Sun’s experiments with Open Source have slowly improved, however, and in one instance include a Mozilla-style license. But instead of a single Java, there are several, depending on who has implemented it; and even Sun itself puts out different versions — including its embedded solution, PicoJava — in the hope of making Java a universal solution. The real way to universalize Java, however, would be to make it Open Source.

Technology marketers know that the most important factor in the success of a technology is its rapid and wide dispersal, so that it not only blocks any competitors but also becomes a de facto standard. There is no better way to spread technology than to give it away. Even proprietary vendors admit privately that every unlicensed user is a supporter of their standard, and might become a licensed user someday (usually when support becomes vital); Lotus 1-2-3, for example, enjoyed this effect in the days when it was riding high. From what we have seen of forking in the Open Source world, a project leader (and Sun would become the leader of an enormous Java project) need not fear forking so long as the project members are getting what they want. Even the forked BSDs share code, and two of the BSD projects have just announced a planned code merge. The rate of improvement of an Open Source Java would no longer be limited by the number of developers that Sun dedicated to it.

Open Source Java?

How would Microsoft respond to a Java made Open Source? Microsoft could ignore Java, having already moved on to building its own distributed object technology and its own Internet standards. Or Microsoft could again combine Java with its own technology, thus forking Java. So far we have seen both choices in action, and while they made Java stumble on its march to becoming a de facto standard, there is still plenty of Java activity: it simply retreated to the server (Microsoft’s next target, of course). Microsoft could not make its own Java or anti-Java technology Open Source because it likes to tie everything into the operating system on which it bases its domination of the markets; Microsoft can’t afford to reveal its operating system secrets as Open Source. That leaves only this question: Could Microsoft’s in-house developers, competing against Sun’s developers plus the enthusiastic Java developers of the world (including many at large corporations, such as IBM) produce a better and more acceptable product than an Open Source Sun Java project could?

Sun’s hardware position is probably the chief obstacle to a Java Open Source strategy. So far Sun’s control of Java development has meant that Java has been optimized for Sun machines. All of Sun’s software innovations, from Solaris to Java, have been aimed at selling Sun hardware. Sun might well fear that an Open Source Java Windows/Intel project, even under its leadership, would cut into Sun machine sales.

Standards in the Linux world

Meanwhile Linux is moving rapidly forward on the authority of a single project leader. There are some 100–150 distros in circulation, and their number will grow as countries prepare their own national distributions. While Linux was still a developer’s hobby, no one gave much thought to its being in English: it was after all a UNIX clone and was written by a Swedish-speaking Finn who has no difficulties with English, which has become a universal language among developers. As Linux gains wider use, however, localized versions become necessary for broad adoption of Linux and its applications. Microsoft, for that matter, has a head start in accommodating other scripts and languages. Not only is there a rising number of localized Linux distros, there are competing localized distros in some countries.

Red Hat and everyone else

The main force in the commercial Linux world right now is Red Hat, with a U.S. market share estimated by IDC (in early 2000) to be 44 percent. Evans Marketing Services surveyed North America and Europe and estimated a 60 percent share. Evans (and some other sources) place the Linux-Mandrake distribution in second place, with 9 percent. SuSE is still the leader in Europe (at the time of Red Hat’s IPO, SuSE claimed larger worldwide sales than Red Hat). Whatever the exact figures, Red Hat is clearly the distro leader. The discomfort caused by Red Hat’s large influence has prompted a rise of interest in the Debian distribution, worked on by a body of volunteers whose only commercial outlet was (until recently) Walnut Creek CD-ROM.

Even in the face of Red Hat’s dominance in the distro market, a shoestring start-up can still become an important player. But Mandrake, as we saw earlier, is the Red Hat distribution, repackaged with a couple of changes. Because Red Hat refused to include non-Open Source software with its basic distribution, it rejected the proprietary-based KDE desktop in favor of the GPL’d GNOME. The Mandrake distribution moved into the niche that Red Hat opened, and came to its initial fame as the Red Hat distro that used KDE (later, Mandrake tuned its distro for the Pentium chip while Red Hat was still targeting the 486, and Mandrake is currently being praised for its easy installation software). The odd effect of Mandrake’s sudden rise to popularity is that it has entrenched Red Hat even further as a de facto standard among the distros; MandrakeSoft makes it clear that it is "compatible with Red Hat Linux." Besides Mandrake, there are at least a dozen distros likewise based on Red Hat, which means that Red Hat as a standard dominates probably some two-thirds of the distro market (and, by one surprising estimate, 85 percent worldwide).

Before we look at what that de facto standard implies, let’s recall the unique value proposition of Red Hat. It is the Red Hat brand, obtainable from no other source. True, there are $2 disks containing the latest Red Hat distribution (seen from the Red Hat point of view as "free" samples, because Red Hat is not paying for them) that only serve to spread the Red Hat standard. Mandrake has in one sense followed Red Hat in this strategy of spreading its technology widely to build its brand: Once Red Hat was firmly seated in the leading place among distributions, it terminated the licensing of its trademarks to Macmillan for Macmillan’s Red Hat distribution because the deal was unremunerative. Linux-Mandrake immediately stepped into the Macmillan slot, probably gaining little or no revenue, but having its trademarks and brand spread widely through Macmillan retail channels originally closed to both vendors. Now that Mandrake has gained high market share, the deal has been restructured: Mandrake is dropping its own boxed distro of Linux-Mandrake in the United States, and relying solely on Macmillan for boxed sales. One study of retail sales of boxed Linux indicated a 36 percent share for Red Hat, a 31 percent share for Macmillan’s Linux-Mandrake, and a 0.1 percent share for the original Linux-Mandrake, which is, by the way, a leader among the distros downloaded from the software site.

There is one other component to the Red Hat brand and unique value proposition. Red Hat is adamant that its products be Open Source. Red Hat made the Cygnus products it acquired Open Source, and the only part of the recently acquired Hell’s Kitchen technology that has not been made Open Source is the portion that processes credit cards, because banking security regulations require that the source code not be exposed. Red Hat used its completely Open Source status as a strategy, particularly with its Red Hat Package Manager (RPM), the software used to install packages so that the operating system can find them. Not only did Red Hat freely distribute the source code, it also paid to have a programming book written about the RPM, and then placed the book under an Open Source publishing license.

Most of the other major distributions restrict in some manner the (re)distribution of the extra software they include for the convenience of the user; depending on the licensing, this practice can prevent the spread of $2 disks, or even force the buyer to purchase a boxed copy for each machine. Mandrake, notably, follows the Red Hat model in not restricting the distribution of the improvements it makes in its distribution. Red Hat attributes its market share principally to this practice (as well as to branding and aggressive marketing, especially in hardware OEM sales and as a platform for ISVs), and regards the restrictions practiced by other distributions as necessarily confining them to niche markets rather than the total Linux market. By playing the proprietary Microsoft game, these distributions are hurting their own chances for wider use.

This is not to say that niche markets are not viable. Caldera is an example. Originally resented by the Linux Community because it targeted business users, and because its sales message was that Linux was the glue for heterogeneous systems (Linux fans believed that Caldera was selling Linux short), its strategy has been vindicated by the general market assessment of Linux: for business rather than for the general user, and most useful for its networking capabilities. Caldera has worked to simplify the usual large Linux distributions by dividing its distros into a workstation product and a server product. It is aiming at smaller businesses rather than large corporations, and at reaching them through small integrators. Interestingly, IDC’s recent preliminary figures on where Linux is used show that small businesses use it slightly more than large businesses, although one would expect large businesses, with their greater UNIX expertise, to be the natural leaders. According to IDC, 47 percent of companies with under 100 employees have one or more Linux servers, compared to 42.1 percent of medium and large companies. Nevertheless, Caldera has a far smaller market share than Red Hat.

The Red Hat standard and the Linux standard base

The press likes to discuss the possibility of forking in Linux, but the question is a little misleading. Properly speaking, Linux is a monolithic kernel, controlled by Linus Torvalds. It is true that anyone can take the kernel source code and put out his own version of the kernel, but no one has seen the point in doing this. Surrounding the Linux kernel is a large number of software libraries, utilities, and other useful material referred to as packages, largely borrowed from GNU and BSD; for this reason Richard Stallman and some others prefer to call the operating system GNU/Linux. The operating system, consisting of selected packages tuned to work with the chosen version of the kernel, constitutes a distribution. The choice of packages in a distribution, and the versions of those packages, are up to the distribution preparer, and may vary accordingly. In this sense, Linux has already forked (slightly), and no matter how great the centripetal force of Open Source, there will always be a place for specialized distributions.

The real concern of the Community over "forking" concerns binary applications, particularly those that ship without accompanying source code. The point of Open Source is to provide source code so that users can compile it from source to binary (running) form to suit their own situations, but it is growing increasingly unrealistic to expect users to do so as the number of nondeveloper users increases, along with the number of proprietary applications that keep their source code secret.

As a result the Community frequently mentions the need for a standard Linux — embodied in the Linux Standard Base (LSB) of the Free Standards Group — because of difficulties in making sure that applications install smoothly atop the various Linux distributions. This approach would ensure application portability not so much by the freedom to recompile as by the setting of standards for all distributions to (voluntarily) follow. The need for application portability demonstrates the strength of the de facto Red Hat standard in that (a) many applications aim themselves at Red Hat as a platform, and (b) Red Hat with its large market share does not worry about Linux forking, nor does it feel an urgent need for the LSB.

In the first place, Red Hat, with its lion's share of the market, knows that other distributions are more likely to have to adjust to it than the other way around, and that the Red Hat way of doing things will have a strong influence on the definition of the Linux Standard Base, which is still a long way from its first release (expected by the end of 2000). In the second place, Red Hat believes that Open Sourcing its innovations, such as the Red Hat Package Manager, will induce other distributions to adopt them, and that it can always pull other Open Source technologies into the Red Hat distribution if it wants them.

Related to this powerful argument against forking is the fact that a widely distributed distro creates a network effect for its users, just as Microsoft Windows and Office have done. This is the argument that people will adopt not the best technology, but the most widely spread one, because they want to be able to interoperate with the largest number of people. There are plenty of Microsoft Word users who will tell you how much they prefer WordPerfect, but they stick with Word because the network effect of Office is too great. Linux users who choose Red Hat for its network effect also have the consolation that if they are missing anything really great, Red Hat will eventually include it.

People fear forking in Linux because they remember its model, UNIX; but UNIX forked precisely because it was proprietary and subdivided itself into proprietary niches, each presided over by a hostile vendor. UNIX forking is the natural result of a proprietary system, while Linux unity is the natural result of the GNU GPL.

Red Hat, although unconcerned about the potential for forking and feeling no particular need for the LSB, has nevertheless pledged its cooperation when the LSB standard finally emerges. Without going into depth on all the technical issues at stake, a look at a couple of simple problems will give you an idea of what is involved.

The most commonly mentioned disparity among distributions used to be the package-management system, the utility that installs, upgrades, and tracks the packages that make up the system. There are two main systems: the older one originating in the Debian distribution, and the newer Red Hat Package Manager (RPM); each expects to find packages prepared in the format peculiar to its own distribution. In the days when the market shares of the two distributions were more evenly matched, there was more argument about the two package managers; now that RPM is so widely spread among the distributions, one hears mostly sighs about the things that Debian does better, and a wish that RPM would pick them up. There are projects afoot to build a newer, better package manager, but even if one were devised, there is no guarantee that Red Hat (or Debian) would adopt it. The most likely solution will be a (necessarily large) package manager that handles both Debian and Red Hat packages.
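The two package formats are in fact easy to tell apart at the byte level, which is the first thing a dual-format manager would have to do. The sketch below is hypothetical (the function name and sample inputs are invented for illustration), but the magic numbers are real: an RPM file begins with a four-byte "lead" signature, and a .deb file is an ordinary ar(1) archive.

```python
# Sketch: distinguish Red Hat (.rpm) and Debian (.deb) package files by
# their opening bytes, as a dual-format package manager would have to.
# detect_package_format() and the sample byte strings are illustrative,
# not part of any real tool.

RPM_MAGIC = b"\xed\xab\xee\xdb"   # the RPM "lead" signature
DEB_MAGIC = b"!<arch>\n"          # a .deb is an ar(1) archive

def detect_package_format(data: bytes) -> str:
    """Classify a package file by its leading magic bytes."""
    if data.startswith(RPM_MAGIC):
        return "rpm"
    if data.startswith(DEB_MAGIC):
        return "deb"
    return "unknown"

if __name__ == "__main__":
    print(detect_package_format(RPM_MAGIC + b"\x00" * 92))      # rpm
    print(detect_package_format(b"!<arch>\ndebian-binary"))     # deb
```

Dispatching on the container format is the easy half; the hard half, as the text suggests, is reconciling the two systems' different dependency and version metadata.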

The distributions most frequently compete on their installation utilities: some users prefer more control, while others want a utility that will effortlessly make the right decisions. Mandrake and Corel are currently praised for their ease of installation. These installation utilities tend to organize files in a manner peculiar to the distribution, so that users (or applications) might not find fonts, drivers, or libraries where they expect them. As a result, some applications play it safe by establishing their own locations for fonts (for instance), and a user may end up with several copies of the same fonts scattered in different places around his system. In some cases users may find that newly installed applications overwrite system files with variant versions, causing other applications to fail.

Windows users will recognize this as the familiar "DLL Hell," the result of having different versions of identically-named executable libraries (.dll files) overwriting each other as each application stows them where it pleases. Similarly, Linux applications (or even the kernel) looking for particular versions of particular libraries may find themselves defeated and unable to run.
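The mechanics of the problem can be reduced to a few lines. In this hypothetical sketch, a dictionary stands in for a shared system library directory, and every name in it (applications and libraries alike) is invented for illustration; the point is only that blind overwriting of identically named files silently changes what other applications depend on.

```python
# Sketch of "DLL Hell": two applications install identically named
# libraries with different versions into one shared location, each
# overwriting blindly. All names here are hypothetical.

installed = {}  # library name -> version currently on the system

def install(app, libs):
    """Install an app's bundled libraries, returning any clobbered versions."""
    clobbered = []
    for name, version in libs.items():
        if name in installed and installed[name] != version:
            clobbered.append((name, installed[name], version))
        installed[name] = version
    return clobbered

install("editor", {"libgui.so": "1.0", "libnet.so": "2.1"})
conflicts = install("mailer", {"libgui.so": "1.3"})
# The editor now silently depends on libgui.so 1.3 and may fail.
print(conflicts)  # [('libgui.so', '1.0', '1.3')]
```

Real package managers avoid exactly this by recording which package owns each file and refusing (or at least warning) before one package overwrites another's files.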

The LSB aims to regulate the interface between the kernel and everything else, that is, the packages and applications, by specifying minimal standards. One such specification is the Filesystem Hierarchy Standard (FHS), whose latest release is 2.1. The LSB reports that the major distributions have implemented either it or its earlier versions, so some progress is apparently being made along these lines. As with all standards bodies, progress has been slow, and there is a continuing tension between rapid technological progress and innovation (born generally of competitive pressure) and the slow setting and adoption of standards. The LSB would do nothing to govern or prevent (for instance) proprietary installation systems, provided their results conformed to the LSB standard; one thing to watch for in the future is whether such proprietary innovations confer competitive advantage (since they cannot be picked up by other distributions, especially the market leader), or whether they isolate the proprietor from the larger market. The BSD varieties, meanwhile, are feeling market pressure to draw closer together, and have begun joint marketing at trade shows under the badge of the red Berkeley Daemon.
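To make the FHS idea concrete, here is a minimal sketch of the kind of check an LSB-conscious installer might run before writing a file. The directory names listed are among those actually defined by the FHS; the helper function and the sample paths are hypothetical.

```python
# Sketch: check whether an installer's target path falls under a
# directory sanctioned by the Filesystem Hierarchy Standard. The
# fhs_sanctioned() helper and sample paths are illustrative only.

FHS_DIRS = (
    "/bin", "/sbin", "/etc", "/lib",        # essential system files
    "/usr/bin", "/usr/lib", "/usr/share",   # sharable, read-only data
    "/var", "/opt", "/usr/local",           # variable data and add-ons
)

def fhs_sanctioned(path: str) -> bool:
    """True if the path lies inside a standard FHS directory."""
    return any(path == d or path.startswith(d + "/") for d in FHS_DIRS)

print(fhs_sanctioned("/usr/share/fonts/myfont.ttf"))  # True
print(fhs_sanctioned("/fonts/myfont.ttf"))            # False
```

An application that installs its fonts under /usr/share will be found by other FHS-conforming software; one that invents /fonts will not, which is exactly the duplication problem described above.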

The shibboleth of UNIX flavors has always been POSIX, and from the beginning Linus Torvalds aimed Linux at the POSIX.1 standard because he wanted UNIX applications to run smoothly on Linux (in fact, many UNIX applications run with no problem once they are recompiled for Linux). But he could not afford the expensive testing needed for POSIX certification. This may have been just as well, because there is a faction in the Community that regards POSIX as a relic of UNIX ("This [Linux] is not your father's UNIX"). A few years back one distro (Linux-FT) did achieve POSIX certification, but it passed from sight; the technology was acquired by Caldera and incorporated into its Linux products, but, like the other distributions, Caldera's is POSIX-compliant rather than certified. Torvalds himself dislikes the POSIX threading standard (along with some post-POSIX.1 standards) and has implemented his own native Linux threading; it does, however, allow POSIX threads ("pthreads") or other threading systems to be implemented on top of it.

On the whole, the condition of standards in the Linux world is no worse than in any other software orbit, and may well be better. UNIX certainly provided no model for unified naming of files or organizing of file systems, because of its proprietary fragmentation; for that matter, even POSIX-certified UNIX flavors differed from each other to the point that applications would not run at all on other systems, and needed more than a simple recompiling to make them do so. The mainstream distributions of Linux are much closer than that.

Follow That Lizard! The Mozilla Project

The large and high-profile Mozilla project has served for two years now as an object lesson for commercial firms entering the Open Source world. It is hardly the failure it was called by some (including one of its departing leaders) a year ago, and we now have the opportunity to see where it goes from here, as it enters its first beta or "pre-release" stage.

Under increasing pressure from Microsoft's Internet Explorer (IE) browser, Netscape decided to call on the resources of the Open Source community to rapidly improve and out-evolve the Microsoft browser. At the time many called the decision an act of desperation by a company with nothing to lose (less than a year later it was acquired by AOL). Internally, the argument was driven by the usual Open Source reasoning and revolved around this premise: at the price of opening up its software and letting outsiders use the technology, Netscape would end up with a world-beating browser. The Internet, after all, was a place of open standards, run on Open Source software; the fact that the Netscape browser revealed to the user the source code of every page it touched, along with its heavy use of Java, pushed the argument in the same direction. It is very hard for Java developers to hide what they are doing, and proprietary vendors resort to scrambling the code ("obfuscation") to discourage the curious.

It was fortunate that as an Internet company Netscape had emphasized cross-platform products and had a Linux version (although Linux fans complained that it was difficult to find and that most employees seemed to know nothing about it!). Although Microsoft’s Internet Explorer 4 also had a UNIX version, it was very much in Netscape’s interest to support a heterogeneous market, just as it was in Microsoft’s interest to kill other platforms. Netscape was able to appeal to the Open Source community for help by pointing to the Linux branch of the Mozilla product.

Netscape chose not to release the code of its current Communicator version, but instead to open up the code of the next version then under development, Communicator 5. Because Java is proprietary to Sun, it had to be removed from the source code, along with other proprietary material and the encryption technology (per federal regulations). One difficult job was to clean up the code so that outsiders would find it easier to work with; even more difficult was the job of preparing a license that would protect all the parties involved, even as the code was being opened.

The license itself — the Netscape Public License (NPL), see Chapter 7, "Complications of Open Source Licensing" — came under fire from many in the Open Source community, chiefly because it allowed Netscape to take private any contributed material and to issue it under other licenses (for example, to provide to existing licensees of Netscape source code). As a consequence, Netscape set up a more neutral scheme involving the Mozilla Public License (MozPL) to use as an alternative. The strongest reaction of the Community, however, was a lack of interest in participating in the project, whether from the licensing structure or from the complexity of the code itself. Consequently, the Mozilla project remained as it was perceived: principally a Netscape project.

The Community had mixed reactions to the code itself. Some said that it was not the Communicator 4 they were used to working with; in addition, the stripping out of third-party technology, cryptography, and the news reader made the code less interesting, and so these developers stayed away. Other Community members, however, while also rejecting the Communicator 5 code, said that what they really liked was another Netscape project, Raptor (today called Gecko), a more standards-compliant HTML layout engine that could form the foundation of a browser. These people urged that the Mozilla project, whose technology was to serve as the basis for the next Communicator release (eventually to appear as Communicator 6), be designed around Gecko, and that the underlying technology fully support official World Wide Web Consortium (W3C) standards such as HTML 4, XML, CSS, and the DOM. Although the Community did little coding, it did contribute some key features, and its architectural expertise and demand for open standards were benefits that could not have been supplied internally, even though the changes required added nearly a year to the length of the project. At the end of the first year, with no released product in sight, Mozilla's best-known project leader and evangelist walked away, convinced of failure.

Some time after his departure, the shape and benefits of the small Gecko engine became more apparent (it was embeddable), and interested corporations began to participate in the project. The number of outside developers grew, but they were not necessarily the midnight hackers of Community legend; many of them came from corporations intending to adopt the technology, and a number of them were put on the work full-time by their employers. Information appliances can embed Gecko on top of an operating system (such as Linux); this is the approach of AOL TV, which will combine Gecko with interactive TV technology from Liberate Technologies (the former Network Computer, Inc.); Nokia will also put out a set-top box based on Intel hardware and using Linux and Gecko. Gecko runs on Windows, Linux, and Mac OS; Nexware Corporation is porting Mozilla technology to the embedded QNX system and its interface, Photon, and the Mozilla project anticipates other UNIX ports.

Whether embedded in appliances or running on desktop clients, Gecko will make possible advanced Web-based applications. With Gecko as the foundation of the Communicator 6 browser, for instance, developers will be able to build applications on top of the browser, just as they can now atop Microsoft's Internet Explorer. Because the technology underneath is based on Web standards, such an application will be able to interoperate with a large variety of software and also take advantage of Internet communications; a hand-held inventory bar-code reader, for instance, could transmit updated inventory data to a distant site, and also look up unfamiliar bar codes stored at another distant location. RealNetworks is using Mozilla technology to build a private-label media player for Global Media that will also display HTML and Macromedia Flash. Companies are free to innovate as they wish, and there are no fees or royalties for the use of Gecko.

How Cool is XUL?

One feature of Mozilla undergoing continuing development is XUL, pronounced "zool," the XML-based User Interface Language that describes the user interface itself. XUL is more than a "skin" on a fixed interface; following the UNIX tradition, the interface itself is separate from the engine. By editing XUL a developer (or very advanced user) can add, subtract, or move controls.
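A fragment in the spirit of XUL makes the idea concrete. The snippet below is a hedged illustration rather than a quotation from Mozilla's actual interface files: the element names follow the general XUL pattern of the time, and the button ids and labels are invented. The point is that reordering or deleting these elements rearranges the browser's own controls, with no recompilation.

```xml
<!-- Hypothetical XUL fragment: the browser's toolbar described as
     data. Deleting or reordering the toolbarbutton elements changes
     the interface itself; the ids and labels here are invented. -->
<window xmlns="http://www.mozilla.org/keymaster/gatekeeper/there.is.only.xul">
  <toolbar id="main-toolbar">
    <toolbarbutton id="back-button" label="Back"/>
    <toolbarbutton id="forward-button" label="Forward"/>
    <toolbarbutton id="reload-button" label="Reload"/>
  </toolbar>
</window>
```

Because the layout engine renders this markup just as it renders a Web page, the same skills that build Web sites can rebuild the browser's own controls.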

Anyone is free to build a browser based on Gecko; so far only NeoPlanet has done so, as an unsupported product that enables viewers to switch between Netscape and Microsoft rendering, a feature useful for developers who must target both browsers. It is an Open Source hope that proper implementation of open Web standards will make it easier for Web sites to prepare a single version of their material, rather than the separate versions for Netscape and for Internet Explorer so often seen today. Although Microsoft currently dominates browser share (80 percent), Gecko's release should refocus attention on supporting open standards and give users a better choice than the current Netscape Communicator. Netscape's owner, AOL, will adopt some of the Mozilla technology, but is not expected to substitute Communicator 6 for Internet Explorer in its client software, because its current use of IE assures AOL of a preinstalled client in preinstalled Windows systems, including an AOL button on the Windows desktop. This is too good a source of new customers for an aggressively growing company to give up. In addition, the customer base of AOL is hardly the experimental, technology-loving crowd that rushes to download new Netscape betas or eagerly anticipates Mozilla.

Netscape began by pushing technology beyond the standards to gain competitive advantage against Mosaic. Microsoft outdid Netscape at this game by making its browser dependent on proprietary technology, and outperforming Communicator 4. Netscape now believes that the new Communicator 6 will be smaller, faster, and more functional than Internet Explorer, and will achieve this through supporting open standards. The next year will show which side actually achieves better support and wider acceptance, and whether Netscape’s faith in Open Source and outside developers was justified.

Next chapter: Intellectual Property

Table of Contents

Copyright © 2000 by IDG Books Worldwide, Inc. Published under IDG Books Worldwide, Inc. Open Content License.