   Translations available: [1]Czech | [2]French | [3]Japanese | [4]Spanish
     ____________________________________________________________________________

Why Open Source Software / Free Software (OSS/FS, FLOSS, or FOSS)? Look at the Numbers!

                                   David A. Wheeler

                         [5]http://dwheeler.com/contactme.html

                              Revised as of July 18, 2015

   This paper (and [6]its supporting database) provides quantitative data showing
   that, in many cases, using [7]open source software / free software (abbreviated
   as OSS/FS, FLOSS, or FOSS) is a reasonable or even superior approach compared to
   its proprietary competition, according to various measures. This paper's goal is to
   show that you should consider using OSS/FS when acquiring software. This paper
   examines [8]popularity, [9]reliability, [10]performance, [11]scalability,
   [12]security, and [13]total cost of ownership. It also has sections on
   [14]non-quantitative issues, [15]unnecessary fears, [16]OSS/FS on the desktop,
   [17]usage reports, [18]governments and OSS/FS, [19]other sites providing related
   information, and ends with some [20]conclusions. An [21]appendix gives more
   background information about OSS/FS. You can view this paper at
   [22]http://dwheeler.com/oss_fs_why.html (HTML format). [23]A short presentation
   (briefing) based on this paper is also available. Palm PDA users may wish to use
   [24]Plucker to view this longer report. [25]Old archived copies and a list of
   [26]changes are also available.

                                     1. Introduction

   [27]Open Source Software / Free Software (aka OSS/FS), also described as
   Free/Libre and Open Source Software (FLOSS), has risen to great prominence.
   Briefly, FLOSS programs are programs whose licenses give users the freedom to run
   the program for any purpose, to study and modify the program, and to redistribute
   copies of either the original or modified program (without having to pay
   royalties to previous developers).

   The goal of this paper is to convince you to consider using FLOSS when you're
   looking for software, using quantitative measures. Some sites provide a few
   anecdotes on why you should use FLOSS, but for many that's not enough information
   to justify using FLOSS. Instead, this paper emphasizes quantitative measures
   (such as experiments and market studies) to justify why using FLOSS products is
   in many circumstances a reasonable or even superior approach. I should note that
   while I find much to like about FLOSS, I'm not a rabid advocate; I use both
   proprietary and FLOSS products myself. Vendors of proprietary products often work
   hard to find numbers to support their claims; this page provides a useful
   antidote of hard figures to aid in comparing proprietary products to FLOSS.
   Others have come to the same conclusions, for example, [28]Forrester Research
   concluded in September 2006 that "Firms should consider open source options for
   mission-critical applications".

   I believe that this paper has met its goal; others seem to think so too. [29]The
   2004 report of the California Performance Review, a report from the state of
   California, urges that "the state should more extensively consider use of open
   source software", and specifically references this paper. [30]A review at the
   Canadian Open Source Education and Research (CanOpenER) site stated "This is an
   excellent look at the some of the reasons why any [organization] should consider
   the use of [FLOSS]... [it] does a wonderful job of bringing the facts and figures
   of real usage comparisons and how the figures are arrived at. No FUD or paid for
   industry reports here, just the facts". This paper has been referenced by many other
   works, too. It's my hope that you'll find it useful as well.

   The following subsections describe the paper's [31]scope, [32]challenges in
   creating it, the paper's [33]terminology, and the [34]bigger picture. This is
   followed by a [35]description of the rest of the paper's organization (listing
   the sections such as [36]popularity, [37]reliability, [38]performance,
   [39]scalability, [40]security, and [41]total cost of ownership). Those who find
   this paper interesting may also be interested in the [42]other documents
   available on David A. Wheeler's personal home page. [43]A short presentation
   (briefing) based on this paper is also available.

   This paper has become long, so there is now [44]a supporting database of OSS/FS
   (FLOSS) quantitative studies that you may find easier to use. You may also be
   interested in the [45]discussion group for quantitative numbers about free /
   libre / open source software.

1.1 Scope

   As noted above, the goal of this paper is to convince you to consider using FLOSS
   when you're looking for software, using quantitative measures. Note that this
   paper's goal is not to show that all FLOSS is better than all proprietary
   software. Certainly, there are many who believe this is true on [46]ethical,
   moral, or social grounds. It's true that FLOSS users have fundamental control and
   flexibility advantages, since they can modify and maintain their own software to
   their liking. And some countries perceive advantages to not being dependent on a
   sole-source company based in another country. However, no numbers could prove the
   broad claim that FLOSS is always "better" (indeed you cannot reasonably use the
   term "better" until you determine what you mean by it). Instead, I'll simply
   compare commonly-used FLOSS software with commonly-used proprietary software, to
   show that at least in certain situations and by certain measures, some FLOSS
   software is at least as good or better than its proprietary competition. Of
   course, some FLOSS software is technically poor, just as some proprietary
   software is technically poor. And remember -- even very good software may not fit
   your specific needs. But although most people understand the need to compare
   proprietary products before using them, many people fail to even consider FLOSS
   products, or they create policies that unnecessarily inhibit their use; those are
   errors this paper tries to correct.

   This paper doesn't describe how to evaluate particular FLOSS programs; a
   [47]companion paper describes how to evaluate FLOSS programs. This paper also
   doesn't explain how an organization would transition to a FLOSS approach if one
   is selected. Other documents cover transition issues, such as [48]The Interchange
   of Data between Administrations (IDA) Open Source Migration Guidelines (November
   2003) and the German [49]KBSt's Open Source Migration Guide (July 2003) (though
   both are somewhat dated). [50]Organizations can transition to FLOSS in part or in
   stages, which for many is a more practical transition approach.

   I'll emphasize the operating system (OS) known as GNU/Linux (which many
   abbreviate as [51]"Linux"), the [52]Apache web server, the [53]Mozilla Firefox
   web browser, and the [54]OpenOffice.org office suite, since these are some of the
   most visible FLOSS projects. I'll also primarily compare FLOSS software to
   Microsoft's products (such as Windows and IIS), since Microsoft Windows is widely
   used and Microsoft is one of proprietary software's strongest proponents. Note,
   however, that even Microsoft makes and uses FLOSS themselves (they have even
   [55]sold software using the GNU GPL license, as discussed below).

   I'll mention Unix systems as well, though the situation with Unix is more
   complex; today's Unix systems include many FLOSS components or software primarily
   derived from FLOSS components. Thus, comparing proprietary Unix systems to FLOSS
   systems (when examined as whole systems) is often not as clear-cut. This paper
   uses the term "Unix-like" to mean systems intentionally similar to Unix; both
   Unix and GNU/Linux are "Unix-like" systems. The most recent Apple Macintosh OS
   (Mac OS X) presents the same kind of complications; older versions of MacOS
   were wholly proprietary, but Apple's OS has been redesigned so that it's now
   based on a Unix system with substantial contributions from FLOSS programs.
   Indeed, [56]Apple is now openly encouraging collaboration with FLOSS developers.

1.2 Challenges

   It's a challenge to write any paper like this; measuring anything is always
   difficult, for example. Most of these figures are from other works, and it was
   difficult to find many of them. But there are some special challenges that you
   should be aware of: legal problems in publishing data, the reluctance of many
   FLOSS users to publicly admit it (for fear of retribution), and dubious studies
   (typically those funded by a product vendor).

   [57]Many proprietary software product licenses include clauses that forbid public
   criticism of the product without the vendor's permission. Obviously, there's no
   reason that such permission would be granted if a review is negative -- such
   vendors can ensure that any negative comments are reduced and that harsh
   critiques, regardless of their truth, are never published. This significantly
   reduces the amount of information available for unbiased comparisons. Reviewers
   may choose to change their report so it can be published (omitting important
   negative information), or not report at all -- in fact, they might not even start
   the evaluation. Some laws, such as [58]UCITA (a law in Maryland and Virginia),
   specifically enforce these clauses forbidding free speech, and in many other
   locations the law is unclear -- making researchers bear substantial legal risk
   that these clauses might be enforced. These legal risks have a chilling effect on
   researchers, and thus make it much harder for customers to receive complete
   unbiased information. This is not merely a theoretical problem; [59]these license
   clauses have already prevented some public critique, e.g., Cambridge researchers
   reported that [60]they were forbidden to publish some of their benchmarked
   results of VMWare ESX Server and Connectix/Microsoft Virtual PC. Oracle has had
   such clauses for years. Hopefully these unwarranted restraints of free speech
   will be removed in the future. But in spite of these legal tactics to prevent
   disclosure of unbiased data, there is still some publicly available data, as this
   paper shows.

   Another problem is that many users of FLOSS are reluctant to admit it. [61]ZDNet
   UK's November 25, 2005 article "Why open source projects are not publicised" by
   Ingrid Marson examines this. For example, it notes that many are afraid of
   retribution. Obviously, this makes some data more difficult to obtain.

   This paper omits or at least tries to warn about studies funded by a product's
   vendor, which have a fundamentally damaging conflict of interest. Remember that
   [62]vendor-sponsored studies are often rigged (no matter who the vendor is) to
   make the vendor look good instead of being fair comparisons. [63]Todd Bishop's
   January 27, 2004 article in the Seattle Post-Intelligencer Reporter discusses the
   serious problems when a vendor funds published research about itself. A study
   funder could directly pay someone and ask them to directly lie, but it's not
   necessary; a smart study funder can produce the results they wish without,
   strictly speaking, lying. For example, a study funder can make sure that the
   evaluation carefully defines a specific environment or extremely narrow question
   that shows a positive trait of their product (ignoring other, probably more
   important factors), require an odd measurement process that happens to show off
   their product, seek unqualified or unscrupulous reviewers who will create
   positive results (without careful controls or even without doing the work!),
   create an unfairly different environment between the compared products (and not
   say so or obfuscate the point), require the reporter to omit any especially
   negative results, or even fund a large number of different studies and only allow
   the positive reports to appear in public. [64]Microsoft's James Plamondon urged
   Microsoft employees to perform various manipulative practices, recommending that
   during "the Slog" of competition they "[work] behind the scenes to orchestrate
   "Independent" praise of our technology, and damnation of the enemy's....
   "Independent" analysts' report should be issued... "Independent" consultants
   should write columns and articles, give conference presentations and moderate
   stacked panels, all on our behalf (and setting themselves up as experts in the
   new technology, available for [lucratively high prices])... "Independent"
   academic sources should be cultivated and quoted (and research money granted)".
   The song "Meat the Press" by Steve Taylor eloquently expresses this kind of
   deception: "They can state the facts while telling a lie".

   This doesn't mean that all vendor-funded studies are misleading, but many are,
   and there's no way to be sure which studies (if any) are actually valid. For
   example, Microsoft's "get the facts" campaign identifies many studies, but nearly
   every study is entirely vendor-funded, and I have no way to determine if any of
   them are valid. After a pair of vendor-funded studies were publicly lambasted,
   [65]Forrester Research announced that it will no longer accept projects that
   involve paid-for, publicized product comparisons. One ad, based on a
   vendor-sponsored study, was [66]found to be misleading by the UK Advertising
   Standards Authority (an independent, self-regulatory body), who [67]formally
   adjudicated against the vendor. This example is important because the study was
   touted as being fair by an "independent" group, yet it was found unfair by an
   organization that examines advertisements; failing to meet the standard of
   truth for an advertisement is a very low bar.

   [68]Steve Hamm's BusinessWeek article "The Truth about Linux and Windows" (April
   22, 2005) noted that far too many reports are simply funded by one side or
   another, and even when they say they aren't, it's difficult to take some
   seriously. In particular, he analyzed a report by the Yankee Group's Laura DiDio,
   asking deeper questions about the data, and found many serious problems. His
   article explained why he just doesn't "trust its conclusions" because "the work
   seems sloppy [and] not reliable" ([69]a Groklaw article also discussed these
   problems).

   Many companies fund studies that place their products in a good light, not just
   Microsoft, and the concerns about vendor-funded studies apply equally to vendors
   of FLOSS products. I'm independent; I have received no funding of any kind to
   write this paper, and I have no financial reason to prefer either FLOSS or
   proprietary software. I recommend that you prefer studies that do not have
   financial incentives for any particular outcome.

   This paper includes data over a series of years, not just the past year; all
   relevant data should be considered when making a decision, instead of arbitrarily
   ignoring older data. Note that the older data shows that FLOSS has a history of
   many positive traits, as opposed to being a temporary phenomenon.

1.3 Terminology and Conventions

   You can see more detailed explanation of the terms "open source software" and
   "Free Software", as well as related information, in [70]the appendix and my
   [71]list of Open Source Software / Free Software (OSS/FS or FLOSS) references at
   http://dwheeler.com/oss_fs_refs.html. Note that those who use the term [72]"open
   source software" tend to emphasize technical advantages of such software (such as
   better reliability and security), while those who use the term [73]"Free
   Software" tend to emphasize freedom from control by another and/or ethical
   issues. The opposite of FLOSS is "closed" or "proprietary" software.

   Other alternative terms for FLOSS, besides either of those terms alone, include
   "libre software" (where libre means free as in freedom), "livre software" (same
   thing), free/libre and open-source software (FLOSS), open source / Free Software
   (OS/FS), free / open source software (FOSS or F/OSS), open-source software
   (indeed, "open-source" is often used as a general adjective), "freed software,"
   and even "public service software" (since often these software projects are
   designed to serve the public at large). I recommend the term "FLOSS" because it
   is easy to say and directly counters the problem that "free" is often
   misunderstood as "no cost". There are other ways to expand FLOSS, including
   Free-Libre and Open Source Software and Free/Libre/Open Source Software.

   Software that cannot be modified and redistributed without further limitation,
   but whose source code is visible (e.g., "source viewable" or "open box" software,
   including "shared source" and "community" licenses), is not considered here since
   such software doesn't meet the definition of FLOSS. FLOSS is not "freeware";
   freeware is usually defined as proprietary software given away without cost, and
   does not provide the basic FLOSS rights to examine, modify, and redistribute the
   program's source code.

   A few writers still make the mistake of saying that FLOSS is "non-commercial" or
   "public domain", or they mistakenly contrast FLOSS with "commercial" products.
   However, today many FLOSS programs are commercial programs, supported by one or
   many for-profit companies, so this designation is quite wrong. Don't make the
   mistake of thinking FLOSS is equivalent to "non-commercial" software! Also,
   [74]nearly all FLOSS programs are not in the public domain. the term "public
   domain software" has a specific legal meaning -- software that has no copyright
   owner -- and that's not true in most cases. In short, don't use the terms "public
   domain" or "non-commercial" as synonyms for FLOSS.

   A FLOSS program must be released under some license giving its users a certain
   set of rights; [75]the most popular FLOSS license is the GNU General Public
   License (GPL). All software released under the GPL is FLOSS, but not all FLOSS
   software uses the GPL; nevertheless, some people do inaccurately use the term
   "GPL software" when they mean FLOSS software. Given the GPL's dominance, however,
   it would be fair to say that any policy that discriminates against the GPL
   discriminates against FLOSS.

   This is a large paper, with many acronyms. A few of the most common acronyms are:

       Acronym    Meaning
       GNU        GNU's Not Unix (a project to create a FLOSS operating system)
       GPL        GNU General Public License (the most common FLOSS license)
       OS, OSes   Operating System, Operating Systems
       FLOSS      Free/Libre and Open Source Software

   This paper uses [76]logical style quoting (as defined by Hart's Rules and the
   Oxford Dictionary for Writers and Editors); quotations do not include extraneous
   punctuation.

1.4 Bigger Picture

   Typical FLOSS projects are, in fact, an example of something much larger:
   commons-based peer-production. The fundamental characteristic of FLOSS is its
   licensing, and a FLOSS project that meets at least one customer's need can be
   considered a success. However, larger FLOSS projects are typically developed by
   many people from different organizations working together for a common goal. As
   the declaration [77]Free Software Leaders Stand Together states, the business
   model of FLOSS "is to reduce the cost of software development and maintenance by
   distributing it among many collaborators". [78]Yochai Benkler's 2002 Yale Law
   Journal article, "Coase's Penguin, or Linux and the Nature of the Firm" argues
   that FLOSS development is only one example of the broader emergence of a new,
   third mode of production in the digitally networked environment. He calls this
   approach "commons-based peer-production" (to distinguish it from the property-
   and contract-based models of firms and markets).

   Many have noted that FLOSS approaches can be applied to many other areas, not
   just software. The Internet encyclopedia [79]Wikipedia, and works created using
   [80]Creative Commons licenses ([81]Yahoo! can search for these), are other
   examples of this development approach. [82]Wide Open: Open source methods and
   their future potential by Geoff Mulgan (who once ran the policy unit at 10
   Downing Street) and Tom Steinberg, with Omar Salem, discusses this wider
   potential. Many have observed that the process of creating scientific knowledge
   has worked in a similar way for centuries.

   FLOSS is also an example of the incredible value that can result when users have
   the [83]freedom to tinker (the freedom to understand, discuss, repair, and modify
   the technological devices they own). Innovations are often created by combining
   pre-existing components in novel ways, which generally requires that users be
   able to modify those components. This freedom is, unfortunately, threatened by
   various laws and regulations such as the U.S. DMCA, and the FCC "broadcast flag".
   It's also threatened by efforts such as [84]"trusted computing" (often called
   "treacherous computing"), whose goal is to create systems in which external
   organizations, not computer users, command complete control over a user's
   computer ([85]BBC News among others is concerned about this).

   Lawrence Lessig's [86]Code and Other Laws of Cyberspace argues that software code
   has the same role in cyberspace as law does in real space. In fact, he simply
   argues that "code is law", that is, that as computers are becoming increasingly
   embedded in our world, what the code does, allows, and prohibits, controls what
   we may or may not do in a powerful way. In particular he discusses the
   implications of "open code".

   All of these issues are beyond the scope of this paper, but the referenced
   materials may help you find more information if you're interested.

1.5 Organization of this Paper

   Below is data discussing [87]popularity, [88]reliability, [89]performance,
   [90]scalability, [91]security, and [92]total cost of ownership. I close with a
   brief discussion of [93]non-quantitative issues, [94]unnecessary fears, [95]FLOSS
   on the desktop, [96]usage reports, [97]other sites providing related information,
   and [98]conclusions. A closing [99]appendix gives more background information
   about FLOSS. Each section has many subsections or points. The non-quantitative
   issues section includes discussions about [100]freedom from control by another
   (especially a single source), [101]protection from licensing litigation,
   [102]flexibility, [103]social / moral / ethical issues, and [104]innovation. The
   unnecessary fears section discusses issues such as [105]support, [106]legal
   rights, [107]copyright infringement, [108]abandonment, [109]license
   unenforceability, [110]GPL "infection", [111]economic non-viability,
   [112]starving programmers (i.e., the [113]rising commercialization of FLOSS),
   [114]compatibility with capitalism, [115]elimination of competition,
   [116]elimination of "intellectual property", [117]unavailability of software,
   [118]importance of source code access, [119]an anti-Microsoft campaign, and
   [120]what's the catch. And the appendix discusses [121]definitions of FLOSS,
   [122]motivations of developers and developing companies, [123]history,
   [124]licenses, [125]FLOSS project management approaches, and [126]forking.

                                      2. Popularity

   Many people think that a product is only a winner if it is popular. This is
   lemming-like, but there's some rationale for this: products that have many users
   get applications built on top of them, trained users, and momentum that reduces
   future risk. Some writers argue against FLOSS or GNU/Linux as "not being
   mainstream", but if their use is widespread then such statements reflect the
   past, not the present. There's excellent evidence that many FLOSS products are
   popular:
    1. The most popular web server has always been FLOSS since such data have been
       collected. For example, Apache is the current #1 web server. [127]Netcraft's
        statistics on web servers have consistently shown Apache (a FLOSS web
       server) is the most popular web server ever since Apache grew into the #1 web
       server in April 1996. Before that time (from August 1995 through March 1996)
       the most popular web server was the NCSA web server (Apache's ancestor), and
       it is also FLOSS.
        [128]Netcraft's survey published May 2011 polled all the web sites they could
        find (totaling 324,697,205 sites) and found that, counting by name, 62.71% of
        web servers ran Apache while 18.37% used the Microsoft web server (these were
        the top two).
       However, many web sites have been created that are simply "placeholder" sites
       (i.e., their domain names have been reserved but they are not being used);
       such sites are termed "inactive". This means that just tracking the names can
       be misleading, and somewhat vulnerable to rigging. This eventually happened.
       In [129]April 2006 there was a one-time significant increase in IIS sites
       (versus Apache) among inactive sites, entirely due to a single company (Go
       Daddy) switching from Apache to IIS when serving inactive sites. While it is
       more difficult for a single active site to switch web servers, it is trivial
       for a hosting organization to switch all its inactive sites. Go Daddy's
       president and COO, [130]Warren Adelman, refused to discuss whether or not
       Microsoft paid or gave other incentives to move its inactive (parked) domains
       to Windows, leading a vast number of people to suspect that Go Daddy was paid
       by Microsoft to make this change, just to try to make Microsoft's popularity
       figures look better than they really were.
       Thus, since 2000, Netcraft has been separately counting [131]"active" web
       sites. Netcraft's count of only the active sites is arguably a more relevant
       figure than counting all web sites, since the count of active sites shows the
       web server selected by those who choose to actually develop a web site.
       Apache does extremely well when counting active sites; in the May 2011
       results, Apache had 57.52% of the web server market and Microsoft had 15.41%.
       [132]Netcraft's latest public SSL survey (Jan 2009) surveyed the number of
       web servers that encrypted their information using TLS/SSL. In short,
       "Netscape once dominated... Microsoft soon caught up and passed... [and now
       the] most popular choice of SSL web servers is the open source Apache
       server". Apache had about 45% of the market, and Microsoft had about 43%, and
       Microsoft's market share was clearly trending down.
       Years ago, Netcraft's September 2002 survey reported on websites based on
       their "IP address" instead of the host name; this has the effect of removing
       computers used to serve multiple sites and sites with multiple names. When
       counting by IP address, Apache has shown a slow increase from 51% at the
       start of 2001 to 54%, while Microsoft has been unchanged at 35%. Again, a
       clear majority.
       CNet's [133]"Apache zooms away from Microsoft's Web server" summed up the
       year 2003 noting that "Apache grew far more rapidly in 2003 than its nearest
       rival, Microsoft's Internet Information Services (IIS), according to a new
       survey--meaning that the open-source software remains by far the most widely
        used Web server on the Internet". The same happened in 2004; in fact, in just
       December 2004 Apache gained a full percentage point over Microsoft's IIS
       among the total number of all web sites.
       Apache's popularity in the web server market has been independently confirmed
       by [134]E-Soft's Security Space - their report on [135]web server market
       share published April 1st, 2007 surveyed 23,331,627 web servers in March 2007
       and found that Apache was #1 (73.29%), with Microsoft IIS being #2 (20.01%).
       [136]E-soft also reports specifically on secure servers (web servers
       supporting SSL/TLS, such as e-commerce sites); Apache leads there too, with
       52.49% of web servers using Apache, as compared to Microsoft's 39.32%. You
       can go to [137]http://www.securityspace.com for more information.
       [138]Netcraft has noted that by April 2007 some domains appear to be running
        lighttpd, but claim to be running Apache instead. For this paper's purpose,
       a lighttpd server claiming to be Apache does not harm the validity of the
       result, though. Both lighttpd and Apache are FLOSS, so the popularity of
       FLOSS web servers would be the sum of them (and other FLOSS web servers)
       anyway.
       Obviously these figures fluctuate monthly; see [139]Netcraft and [140]E-soft
       for their latest survey figures.
    2. Internet Explorer has been losing popularity to FLOSS web browsers (such as
       Mozilla Firefox) since mid-2004, a trend especially obvious in leading
       indicators such as technology sites, web development sites, and bloggers. PC
       World found that in [141]July 2004, Internet Explorer began to become
        measurably less popular compared to FLOSS browsers. According to PC World,
       IE lost 1% of its popularity in a single month, July 2004. In the same time
       period Mozilla-based browser use increased by 26%. IE was still far more
       widely used at this time according to this July 2004 poll (94.73%), but IE
       hadn't lost market share for many years, and it takes a significant event for
       that many people to change browsers. This was probably at least in part due
       to [142]repeated security problems (though its poor support of web standards
       and lack of features may also have had a role). Note that the major Mozilla
       rewrite of its web browser, [143]Mozilla Firefox, wasn't even officially
        available at the time; Firefox wasn't officially released until November 9, 2004.
       A multitude of studies show that IE is losing popularity, while FLOSS web
       browsers (particularly Firefox and Chrome) are gaining popularity. The figure
       above shows web browser market share over time; the red squares are Internet
       Explorer's market share (all versions), and the blue circles are the
       combination of the older Mozilla suite and the newer Mozilla Firefox web
       browser (both of which are FLOSS).
       FLOSS web browsers (particularly Firefox) are gradually gaining market share
       among the general population of web users. [144]By November 1, 2004, Ziff
       Davis revealed that IE had lost about another percent of the market in only 7
       weeks. [145]Chuck Upsdell has combined many data sources and estimates that,
       as of September 2004, IE has decreased from 94% to 84%, as users switch to
       other browser families (mainly Gecko); he also believes this downward trend
       is likely to continue. [146]Information Week reported in March 18, 2005, some
       results from Net Applications (a maker of Web-monitoring software). Net
       Applications found that Firefox use rose to 6.17% of the market in February
       2005, compared to 5.59% in January 2005. [147]WebSideStory reported in
       February 2005 that Firefox's general market share was 5.69% as of February
       18, 2005, compared to IE's 89.85%. [148]OneStat reported on February 28,
       2005, that Mozilla-based browsers' global usage share (or at least Firefox's)
        is 8.45%, compared to IE's 87.28%. Co-founder Niels Brinkman suspected that IE
        5 users upgrading to Firefox, rather than to IE 6, was at least one reason why
       "global usage share of Mozilla's Firefox is still increasing and the total
       global usage share of Microsoft's Internet Explorer is still decreasing". The
       site [149]TheCounter.com reports global statistics about web browsers;
       [150]February 2005 shows Mozilla-based browsers (including Firefox, but not
       Netscape) had 6%, while IE 6 had 81% and IE 5 had 8% (89% total for IE). This
       is a significant growth; the [151]August 2004 study of 6 months earlier had
       Mozilla 2%, IE 6 with 79%, and IE 5 with 13% (92% for IE). The website
       [152]quotationspage.com is a popular general-use website; [153]quotationspage
       statistics of February 2004 and 2005 show a marked rise in the use of FLOSS
       browsers. In February 2004, IE had 89.93% while Mozilla-based browsers
       accounted for 5.29% of browser users; by February 2005, IE had dropped to
       76.47% while Mozilla-based browsers (including Firefox) had risen to 14.11%.
       [154]Janco Associates also reported Firefox market share data; comparing
       January 2005 to April 2005, Firefox had jumped from 4.23% to 10.28% of the
       market (IE dropped from 84.85% to 83.07% in that time, and Mozilla, Netscape,
       and AOL all lost market share in this time as well according to this survey).
       [155]Nielsen/NetRatings' survey of site visitors found that in June 2004,
       795,000 people visited the Firefox website (this was the minimum for their
       tracking system). There were 2.2 million in January 2005, 1.6 million in
       February, and 2.6 million people who visited the Firefox web site in March
       2005. The numbers were also up for Mozilla.org, the Web site of the Mozilla
        Foundation (Firefox's developer).
       In [156]October 2006, TechWeb noted that Firefox was continuing to grow,
        citing [157]MarketShare's report that Firefox had continued to grow - it was now
       at 12.46% market share as of September 2006 among all browsers for
       general-purpose browsing (up from 11.84% the previous month).
       [158]InformationWeek reported on January 16, 2007 that Firefox's market share
       was continuing to climb after IE 7's release.
       The growth of FLOSS web browsers becomes even more impressive when home users
       are specifically studied. Home users can choose which browser to use, while
        many business users cannot choose their web browser (it's selected by the
       company, and companies are often slow to change). XitiMonitor surveyed a
       sample of websites used on a Sunday (March 6, 2005), totalling 16,650,993
       visits. By surveying Sunday, they intended to primarily find out what people
       choose to use. Of the German users, an astonishing 21.4% were using Firefox.
       The other countries surveyed were France (12.2%), England (10.9%), Spain
       (9%), and Italy (8.6%). Here is [159]the original XitiMonitor study of
       2005-03-06, an [160]automated translation of the XitiMonitor study, and
       [161]a blog summary of the XitiMonitor study observing that, "Web sites
       aiming at the consumer have [no] other choice but [to make] sure that they
       are compatible with Firefox ... Ignoring compatibility with Firefox and other
       modern browsers does not make sense business-wise".
       Using this data, we can determine that 13.3% of European home users were
        using Firefox on this date in March 2005. How can we get such a figure? Well,
       we can use these major European countries as representatives of Europe as a
       whole; they're certainly representative of western Europe, since they're the
       most populous countries. Presuming that the vast majority of Sunday users are
       home users is quite reasonable for Europe. We can then make the reasonable
       presumption that the number of web browser users is proportional to the
       general population. Then we just need to get the countries' populations; I
       used the [162]CIA World Fact Book updated to 2005-02-10. These countries'
       populations (in millions) are, in the same order as above, 82, 60, 60, 40,
       and 58; calculating (21.4%*82 + 12.2%*60 + 10.9%*60 + 9%*40 + 8.6%*58) /
       (82+60+60+40+58) yields 13.3%.
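
        As a cross-check, here is a minimal sketch (in Python; not part of the
        original paper) of the weighted-average calculation just described. The
        shares and populations are taken from the text above; the variable names are
        just illustrative:

        # Illustrative sketch: estimate of European home-user Firefox share,
        # March 6, 2005. Each country's Sunday Firefox share is weighted by its
        # population in millions, then divided by the total population.
        countries = {
            "Germany": (0.214, 82),
            "France":  (0.122, 60),
            "England": (0.109, 60),
            "Spain":   (0.090, 40),
            "Italy":   (0.086, 58),
        }
        weighted_share = sum(share * pop for share, pop in countries.values())
        total_population = sum(pop for _, pop in countries.values())
        print(f"{weighted_share / total_population:.1%}")   # prints 13.3%
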
       Among leading-edge indicators such as the technically savvy and web
       developers, the market penetration has been even more rapid and widespread.
       In one case (Ars Technica), Firefox has become the leading web browser! This
       is a leading indicator because these are the people developing the web sites
       you'll see tomorrow; in many cases, they've already switched to FLOSS web
       browsers such as Firefox. W3schools is a site dedicated to aiding web
        developers, and as part of that role they track the browsers that web developers
        use. [163]W3schools found a dramatic shift from July 2003 to September 2004,
        with IE dropping from 87.2% to 74.8% and Gecko-based browsers (including
        Netscape 7, Mozilla, and Firefox) rising from 7.2% to 19%. ([164]W3Schools'
       current statistics are available). This trend has continued; as of March 2005
       Firefox was still growing in market share, having grown to 21.5% (with an
       increase every month), while IE was shrinking quickly (IE 6 was down to 64.0%
        and decreasing every month). [165]CNET found that among its News.com
       readers, site visitors with FLOSS browsers jumped up from 8% in January 2004
       to 18% by September 2004. Statistics for [166]Engadget.com, which has a
        technical audience, found that as of September 2004, only 57% used a Microsoft
       browser and Firefox had rapidly risen to 18%. IT pundits such as [167]PC
       Magazine's John C. Dvorak reported even more dramatic slides, with IE
       dropping to 50% share. [168]InformationWeek reported that on March 30, 2005,
       22% of visitors used Firefox, versus 69% who used Internet Explorer. The
       technical website [169]Ars Technica reported on March 27, 2005, that Firefox
       was now their #1 browser at 40%, while IE was down to #2 at 30% (vs. 38% in
       September 2004).
       Bloggers, another group of especially active web users (and thus, I believe,
       another leading indicator) also suggest this is a trend.
       [170]InformationWeek's March 30, 2005 article "Firefox Thrives Among
       Bloggers" specifically discussed this point. InformationWeek reported that on
       Boing Boing, one of the most popular blog sites, March 2005 statistics show
       that more of their users use Firefox than Internet Explorer: 35.9% of its
       visitors use Firefox, compared with 34.5% using Internet Explorer. I checked
        Boing Boing's April 2, 2005 statistics; they reported Firefox at 39.1%, IE at
       33.8%, Safari at 8.8%, and Mozilla at 4.1%; this means that Firefox plus
       Mozilla was at 43.2%, significantly beyond IE's 33.8%. Between January 1
        through March 9, the Technometria blog found that "Firefox accounted for 28%
       of browsers compared with 58% for Internet Explorer". Kottke.org reported on
       February 27 that 41% of visitors used Mozilla-based browsers (such as
       Firefox), while 31% used Internet Explorer.
       [171]Net Applications' tracking of web browser market share found that
        Firefox's market share has been growing, reaching 10.05% by March 2006. News
       sources, such as [172]ComputerWorld and [173]InformationWeek, trumpeted this
       news; 10% of all web browsers (and growing) is such a large market that it's
       now considered risky for developers to ignore Firefox.
       [174]OneStat.com's statistics of July 9, 2006 show increasing Firefox use.
       They found that global Firefox market share had stabilized for a little
       while, and then rapidly grown again. Their statistics found that globally
       Mozilla Firefox had 12.93% (compared to IE's 83.05%), and that it varied
       considerably by country. In the U.S., Firefox was at 15.82% (compared to IE
       79.78%), while in Germany Firefox had 39.02% (compared to IE's 55.99%).
       These increasing market share statistics are in spite of data-gathering
       problems that under-report FLOSS browsers. [175]Some non-IE browsers are
       configured to lie and use the same identification string as Internet
       Explorer, even though they aren't actually IE. Thus, all of these studies are
       almost certainly understating the actual share of non-IE browsers, though the
       amount of understatement is generally unknown.
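
        To illustrate the mechanism behind this under-reporting, here is a minimal
        sketch (in Python; not from the original paper) of the kind of naive
        User-Agent substring matching a log analyzer might use. The User-Agent
        strings and browser names below are illustrative examples only:

        # Illustrative sketch (not the paper's method): naive classification where
        # anything whose User-Agent contains "MSIE" is counted as Internet Explorer,
        # so a non-IE browser that spoofs that token is miscounted as IE.
        def classify(user_agent: str) -> str:
            if "MSIE" in user_agent:
                return "Internet Explorer"
            if "Firefox" in user_agent:
                return "Firefox"
            return "Other"

        real_ie = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
        spoofer = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1) OtherBrowser/1.0"
        print(classify(real_ie))   # Internet Explorer
        print(classify(spoofer))   # Internet Explorer -- the non-IE browser is miscounted
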
       In short, efforts such as the grassroots [176]Spread Firefox marketing group
       seem to have been very effective at convincing people to try out the FLOSS
       web browser Firefox. Once people try it, they appear to like it enough to
       continue using it. [177]Mitchell Baker and the Firefox Paradox by David H.
       Freedman (Inc.com magazine) reviews the history and context of Firefox. They
       had set the absurdly ambitious goal of a million downloads within 10 days of
       release in November 2004; they reached that in only 4 days, and had 10
       million downloads within 30 days. In only a year, Firefox was being
       downloaded an average of 250,000 times per day. He concludes that Mozilla
       "may be the hottest tech company in America".
       The [178]Wikipedia article on usage share of web browsers (May 30, 2011
       version) summarizes web browser market share data from a variety of sources,
       specifically Net Applications, StatCounter, W3Counter, and Wikipedia itself.
       They found that the shares, averaged worldwide, were Internet Explorer 43.2%,
       Mozilla Firefox 28.6%, Google Chrome 14.6%, Safari 6.3%, Opera 2.6%, and
       other Mobile browsers 4.7%; both Mozilla Firefox and Google Chrome are FLOSS,
       so at least 43.2% of web browser users are using FLOSS tools. This varies by
       region, for example, in Europe Firefox is the most popular web browser, and
       FLOSS as a whole has a commanding lead.
    3. Linux-based Android smart phones have become a powerful market force. In a
       3-month period ending November 2010 in the U.S., the market shares were RIM
       33.5% (fell 4.1%), Android 26% (grew 6.4%), Apple 25% (grew less than 1%),
       Microsoft 9% (fell 1.8%), Palm 3.9% (fell 0.7%), per [179]Comscore as
        reported by InformationWeek 2011. [180]Gartner and [181]Nielsen have also
       posted market share data showing the growth of Android.
    4. GNU/Linux is the #2 web serving OS on the public Internet (counting by
       physical machine), according to a study by Netcraft surveying March and June
       2001. Some of [182]Netcraft's surveys have also included data on OSes; two
       2001 surveys (their [183]June 2001 and [184]September 2001 surveys) found
       that GNU/Linux is the #2 OS for web servers when counting physical machines
       (and has been consistently gaining market share since February 1999). As
       Netcraft themselves point out, the usual Netcraft web server survey
       (discussed above) counts web server host names rather than physical
       computers, and so it doesn't measure such things as the installed hardware
       base. Companies can run several thousand web sites on one computer, and most
       of the world's web sites are located at hosting and co-location companies.
       Therefore, Netcraft developed a technique that indicates the number of actual
       computers being used as Web servers, together with the OS and web server
       software used (by arranging many IP addresses to reply to Netcraft
       simultaneously and then analyzing the responses). This is a statistical
       approach, so many visits to the site are used over a month to build up
       sufficient certainty. In some cases, the OS detected is that of a "front"
       device rather than the web server actually performing the task. Still,
       Netcraft believes that the error margins world-wide are well within the order
       of plus or minus 10%, and this is in any case the best available data.
       Before presenting the data, it's important to explain Netcraft's system for
       dating the data. Netcraft dates their information based on the web server
       surveys (not the publication date), and they only report OS summaries from an
       earlier month. Thus, the survey dated "June 2001" was published in July and
       covers OS survey results of March 2001, while the survey dated "September
       2001" was published in October and covers the operating system survey results
       of June 2001.
       Here's a summary of Netcraft's study results:

    OS group        Percentage (March) Percentage (June) Composition
    Windows              49.2%             49.6%         Windows 2000, NT4, NT3, Windows 95, Windows 98
    [GNU/]Linux          28.5%             29.6%         [GNU/]Linux
    Solaris               7.6%              7.1%         Solaris 2, Solaris 7, Solaris 8
    BSD                   6.3%              6.1%         BSDI BSD/OS, FreeBSD, NetBSD, OpenBSD
    Other Unix            2.4%              2.2%         AIX, Compaq Tru64, HP-UX, IRIX, SCO Unix, SunOS 4 and others
    Other non-Unix        2.5%              2.4%         MacOS, NetWare, proprietary IBM OSes
    Unknown               3.6%              3.0%         not identified by Netcraft OS detector
       Much depends on what you want to measure. Several of the BSDs (FreeBSD,
       NetBSD, and OpenBSD) are FLOSS as well; so at least a part of the 6.1% for
       BSD should be added to GNU/Linux's 29.6% to determine the percentage of FLOSS
       OSes being used as web servers. Thus, it's likely that approximately
       one-third of web serving computers use FLOSS OSes. There are also regional
       differences, for example, GNU/Linux leads Windows in Germany, Hungary, the
       Czech Republic, and Poland.
       Well-known web sites using FLOSS include [185]Google (GNU/Linux) and
       [186]Yahoo (FreeBSD).
       If you really want to know about the web server market breakdown of "Unix vs.
       Windows," you can find that also in this study. All of the various Windows
       OSes are rolled into a single number (even Windows 95/98 and Windows
       2000/NT4/NT3 are merged, although they are fundamentally very different
       systems). Merging all the Unix-like systems in a similar way produces a total
       of 44.8% for Unix-like systems (compared to Windows' 49.2%) in March 2001.
       Note that these figures would probably be quite different if they were based
       on web addresses instead of physical computers; in such a case, the clear
       majority of web sites are hosted by Unix-like systems. As stated by Netcraft,
       "Although Apache running on various Unix systems runs more sites than
       Windows, Apache is heavily deployed at hosting companies and ISPs who strive
       to run as many sites as possible on one computer to save costs".
    5. GNU/Linux is the #1 server OS on the public Internet (counting by domain
       name), according to a 1999 survey of primarily European and educational
       sites. The first study that I've found that examined GNU/Linux's market
       penetration is a survey by [187]Zoebelein in April 1999. This survey found
       that, of the total number of servers deployed on the Internet in 1999
       (running at least ftp, news, or http (WWW)) in a database of names they used,
       the #1 OS was GNU/Linux (at 28.5%), with others trailing. It's important to
       note that this survey, which is the first one that I've found to try to
       answer questions of market share, used existing databases of servers from the
       .edu (educational domain) and the RIPE database (which covers Europe , the
       Middle East, parts of Asia, and parts of Africa), so this isn't really a
       survey of "the whole Internet" (e.g., it omits ".com" and ".net"). This is a
       count by domain name (e.g., the text name you would type into a web browser
       for a location) instead of by physical computer, so what it's counting is
       different than the Netcraft June 2001 OS study. Also, this study counted
       servers providing ftp and news services (not just web servers).
       Here's how the various OSes fared in the study:

    Operating System Market Share Composition
    GNU/Linux               28.5% GNU/Linux
    Windows                 24.4% All Windows combined (including 95, 98, NT)
    Sun                     17.7% Sun Solaris or SunOS
    BSD                     15.0% BSD Family (FreeBSD, NetBSD, OpenBSD, BSDI, ...)
    IRIX                     5.3% SGI IRIX

       A part of the BSD family is also FLOSS, so the FLOSS OS total is even higher;
       if over 2/3 of the BSDs are FLOSS, then the total share of FLOSS would be
       about 40%. Advocates of Unix-like systems will notice that the majority
       (around 66%) were running Unix-like systems, while only around 24% ran a
       Microsoft Windows variant.
    6. GNU/Linux was the #2 server OS sold in 1999, 2000, and 2001. According to
       [188]a June 2000 IDC survey of 1999 licenses, 24% of all servers (counting
       both Internet and intranet servers) installed in 1999 ran GNU/Linux. Windows
       NT came in first with 36%; all Unixes combined totaled 15%. Again, since some
       of the Unixes are FLOSS systems (e.g., FreeBSD, OpenBSD, and NetBSD), the
       number of FLOSS systems is actually larger than the GNU/Linux figures. Note
       that it all depends on what you want to count; 39% of all servers installed
       from this survey were Unix-like (that's 24%+15%), so "Unix-like" servers were
       actually #1 in installed market share once you count GNU/Linux and Unix
       together.
       IDC released a similar study on January 17, 2001 titled [189]"Server
       Operating Environments: 2000 Year in Review". On the server, Windows
       accounted for 41% of new server OS sales in 2000, growing by 20% - but
       GNU/Linux accounted for 27% and grew even faster, by 24%. Other major Unixes
       had 13%.
       [190]IDC's 2002 report found that Linux held its own in 2001 at 25%. All of
       this is especially intriguing since GNU/Linux had 0.5% of the market in 1995,
       [191]according to a Forbes quote of IDC. Data such as these (and the TCO data
       shown later) have inspired statements such as this one from IT-Director on
       November 12, 2001: [192]"Linux on the desktop is still too early to call, but
       on the server it now looks to be unstoppable".
       These measures do not measure all server systems installed that year; some
       Windows systems are copies that have not been paid for (sometimes called
       pirated software), and FLOSS OSes such as GNU/Linux and the BSDs are often
       downloaded and installed on multiple systems (since it's legal and free to do
       so).
       Note that [193]a study published October 28, 2002 by the IT analyst company
       Butler Group concluded that on or before 2009, Linux and Microsoft's .Net
       will have fully penetrated the server OS market from file and print servers
       through to the mainframe.
    7. GNU/Linux and Windows systems (when Windows CE and XP are combined) are the
       leaders and essentially even in terms of developer use for future embedded
       projects, according to Evans Data Corporation (EDC). [194]Their Embedded
       Systems Developer Survey, fielded in July 2002, asked developers "For each of
       the following operating systems, please indicate whether you are targeting
       the OS on your current project or your next project". They collected data
       from 444 developers. Their results: 30.2% of embedded developers use or
       expect to use Linux, while 16.2% say they will use Windows CE and another
       14.4% say they will use Windows XP Embedded. If the two Windows systems are
       combined, this gives Windows Embedded operating systems a statistically
       insignificant edge over Embedded Linux (at 30.6% vs. 30.2%). However,
       Embedded Linux has nearly double the growth rate, and combining two different
       Windows systems into a single value is somewhat misleading. Wind River's
       VxWorks embedded OS, the current embedded software market leader, "trails
       slightly behind Embedded Linux for current project use, and VxWorks' modest
       gain of just 2.9% for expected use in future projects drops it to a distant
       third place position, ending up with less than half the usage rate of the two
       neck-and-neck future project usage leaders (Windows Embedded and Embedded
       Linux)".
    8. An Evans Data survey published in November 2001 found that 48.1% of
       international developers and 39.6% of North Americans plan to target most of
       their applications to GNU/Linux. In October 2002, they found that 59% of
       developers expect to write Linux applications in the next year. The
       [195]November 2001 edition of the Evans Data International Developer Survey
       Series reported on in-depth interviews with over 400 developers representing
       over 70 countries, and found that when asked which OS they plan to target
       with most of their applications next year, 48.1% of international developers
       and 39.6% of North Americans stated that they plan to target most of their
       applications to GNU/Linux. This is surprising since only a year earlier less
       than a third of the international development community was writing GNU/Linux
       applications. The survey also found that 37.8% of the international
       development community and 33.7% of North American developers have already
       written applications for GNU/Linux, and that over half of those surveyed have
       enough confidence in GNU/Linux to use it for mission-critical applications.
       [196]Evans Data conducted a survey in October 2002. In this survey, they
       reported "Linux continues to expand its user base. 59% of survey respondents
       expect to write Linux applications in the next year".
    9. An IBM-sponsored study on Linux suggested that GNU/Linux has "won" the server
       war as of 2006, as 83% were using GNU/Linux to deploy new systems versus only
       23% for Windows. The November 9, 2006 article [197]The war is over and Linux
       won by Dana Blankenhorn summarizes a new IBM-sponsored study. IBM determined
       that 83% of companies expect to support new workloads on Linux next year,
       against 23% for Windows. He noted, "Over two-thirds of the respondents said
       they will increase their use of Linux in the next year, and almost no one
       said the opposite".
   10. Half of all mission-critical business applications are expected to run on
        GNU/Linux by 2012. [198]A survey of IT directors, vice presidents and CIOs
       carried out by Saugatuck Research, reported in January 2007, suggests that
       nearly half of all companies will be running mission-critical business
       applications on Linux in five years' time.
   11. An Evans Data survey made public in February 2004 found that 1.1 million
       developers in North America were working on FLOSS projects. [199]Evans Data's
       North American Developer Population Study examined the number of software
       developers using various approaches. It found that more than 1.1 million
       developers in North America were spending at least some of their time working
       on Open Source development projects. That's an extraordinarily large number
       of people, and it doesn't even account for developers in other countries.
       Many only develop part-time, but that many people can develop a lot of
       software, and having a large number of people increases the likelihood of
       helpful insights and innovations in various FLOSS projects.
   12. A 2004 InformationWeek survey found that 67% of companies use FLOSS products,
       with another 16% expecting to use it in 2005; only 17% have no near-term
       plans to support FLOSS products. The November 1, 2004 InformationWeek article
       [200]Open-Source Software Use Joins The Mix by Helen D'Antoni reported the
       results from InformationWeek Research, which measured adoption of
       "open-source architecture" and found that adoption is widespread. The survey
       also found other interesting results: "In general, companies don't view
       open-source software as risky. It often functions alongside [proprietary] and
       internally developed software, and because of this acceptance, open-source
       code is being used more broadly. Its use is evolving as companies look for
       cost-effective ways to manage software expenses". Of those companies using
       FLOSS, they found that 42% of companies implement production database
       operations using FLOSS, with 33% more considering it; only 25% are not using
       or considering FLOSS for production database use.
   13. A Japanese survey found widespread use and support for GNU/Linux; overall use
       of GNU/Linux jumped from 35.5% in 2001 to 64.3% in 2002 of Japanese
       corporations, and GNU/Linux was the most popular platform for small projects.
       The book [201]Linux White Paper 2003 (published by Impress Corporation)
       surveys the use of GNU/Linux in Japan (it is an update to an earlier book,
       "Linux White Paper 2001-2002"). This is written in Japanese; here is a brief
       summary of its contents.
       The survey has two parts, user and vendor. In "Part I : User enterprise",
       they surveyed 729 enterprises that use servers. In "Part II : Vendor
       enterprise", they surveyed 276 vendor enterprises who supply server
       computers, including system integrators, software developers, IT service
       suppliers, and hardware resellers. The most interesting results are those
       that discuss the use of Linux servers in user enterprises, the support of
       Linux servers by vendors, and Linux server adoption in system integration
       projects.
       First, the use of Linux servers in user enterprises:

                                    System          2002 2001
                                 Linux server      64.3% 35.5%
                             Windows 2000 Server   59.9% 37.0%
                              Windows NT Server    64.3% 74.2%
                            Commercial Unix server 37.7% 31.2%
       And specifically, here's the average use in 2002:

                           System         Ave. units      # samples
                        Linux server         13.4    N=429 (5.3 in 2001)
                    Windows 2000 Server      24.6           N=380
                     Windows NT Server       4.5            N=413
                   Commercial Unix server    6.9            N=233
       Linux servers are the fastest growing category from last year. The average
        number of server units per enterprise increased 2.5-fold, from 5.3 units to 13.4
       units.
       Second, note the support of GNU/Linux servers by vendors:

                                 System         Year 2002 Support
                         Windows NT/2000 Server       66.7%
                              Linux server            49.3%
                         Commercial Unix server       38.0%
        This is the rate of vendors that develop or sell products supporting Linux
        servers; note that Linux is already a major OS when compared with its
        competitors. The reasons for supporting Linux servers were also surveyed,
        which turn out to be different from the reasons in some other countries (for a
       contrast, see the [202]European FLOSS report):

                        Increase of importance in the future 44.1%
                          Requirement from their customers   41.2%
                              Major OS in their market       38.2%
                                Free of licence fee          37.5%
                        Most reasonable OS for their purpose 36.0%
                                    Open source              34.6%
                                  High reliability           27.2%
       Third, note the rate of Linux server adoption in system integration projects:

                   Project Size (Million Yen)    Linux    Win2000 Unix
                                              2002  2001   2002   2002
                              0-3             62.7% 65.7%  53.8%  15.4%
                              3-10            51.5% 53.7%  56.3%  37.1%
                             10-50            38.3% 48.9%  55.8%  55.8%
                             50-100           39.0% 20.0%  45.8%  74.6%
                              100+            24.4% 9.1%   51.1%  80.0%
        Here 1 Million Yen is roughly $8,000 US. GNU/Linux servers are No. 1 (62.7%)
        in small projects of less than 3,000,000 Yen ($24,000 US), and GNU/Linux has
        grown in larger projects of more than 50,000,000 Yen ($400,000 US), from
        20.0% to 39.0%.
       In projects over 100,000,000 Yen ($800,000 US), Linux is adopted by 24.4% of
       the projects (mainly as a substitute for proprietary Unix systems). Note that
       many projects (especially large ones) use multiple platforms simultaneously,
       so the values need not total 100%.
       Note that the Japanese [203]Linux white paper 2003 found that 49.3% of IT
       solution vendors support Linux in Japan.
   14. The European FLOSS study found significant use of FLOSS. The large report
       [204]Free/Libre and Open Source Software (FLOSS): Survey and Study, published
       in June 2002, examined many issues including the use of FLOSS. This study
       found significant variance in the use of FLOSS; 43.7% of German
       establishments reported using FLOSS, 31.5% of British establishments reported
       using FLOSS, while only 17.7% of Swedish establishments reported using FLOSS.
        In addition, they found that OSS usage rates were higher among larger
        establishments than among smaller ones, and that OSS usage rates in the
        public sector were above average.
   15. Microsoft sponsored its own research to "prove" that GNU/Linux is not as
       widely used, but this research has been shown to be seriously flawed.
       Microsoft sponsored a [205]Gartner Dataquest report claiming only 8.6% of
       servers shipped in the U.S. during the third quarter of 2000 were
       Linux-based. However, it's worth noting that Microsoft (as the research
       sponsor) has every incentive to create low numbers, and these numbers are
        quite different from IDC's research on the same subject. IDC's Kusnetzky
       commented that the likely explanation is that Gartner used a very narrow
       definition of "shipped"; he thought the number was "quite reasonable" if it
       only surveyed new servers with Linux, "But our research is that this is not
       how most users get their Linux. We found that just 10 to 15 percent of Linux
       adoption comes from pre-installed machines... for every paid copy of Linux,
       there is a free copy that can be replicated 15 times". Note that it's quite
       difficult to buy a new x86 computer without a Microsoft OS (Microsoft's
       contracts with computer makers ensure this), but that doesn't mean that these
       OSes are used. Gartner claimed that it used interviews to counter this
       problem, but its final research results (when compared to known facts)
       suggest that Gartner did not really counter this effect. For example, Gartner
       states that Linux shipments in the supercomputer field were zero. In fact,
       Linux is widely used on commodity parallel clusters at many scientific sites,
       including many high-profile sites. Many of these systems were assembled
       in-house, showing that Gartner's method of defining a "shipment" does not
       appear to correlate to working installations. The Register's article,
       [206]"No one's using Linux" (with its companion article [207]"90% Windows.".)
       discusses this further. In short, Microsoft-sponsored research has reported
       low numbers, but these numbers are quite suspect.
   16. Businesses plan to increase their use of GNU/Linux. A Zona Research study
       found that over half of the large enterprise respondents expected increases
       of up to 25% in the number of GNU/Linux users in their firm, while nearly 20%
       expected increases of over 50%. In small companies, over one third felt that
       GNU/Linux usage would expand by 50%. The most important factors identified
       that drove these decisions were reliability, lower price, speed of
       applications, and scalability. Here are the numbers:

      Expected GNU/Linux Use Small Business Midsize Business Large Business Total
           50% increase      21.0%          16%              19.0%          19%
         10-25% increase     30.5%          42%              56.5%          44%
            No growth        45.5%          42%              24.5%          36%
            Reduction        3.0%           0%               0%             1%

       You can see more about this study in [208]"The New Religion: Linux and Open
       Source" (ZDNet) and in InfoWorld's February 5, 2001 article "Linux lights up
       enterprise: But concerns loom about OS vendor profitability".
   17. The global top 1000 Internet Service Providers expect GNU/Linux use to
       increase by 154%, according to Idaya's survey conducted January through March
       2001. A [209]survey conducted by Idaya of the global top 1000 ISPs found that
       they expected GNU/Linux to grow a further 154% in 2001. Also, almost two
        thirds (64%) of ISPs consider that the leading open source software meets the
        standard required for enterprise-level applications, comparable with
       proprietary software. Idaya produces FLOSS software, so keep that in mind as
       a potential bias.
   18. A 2002 European survey found that 49% of CIOs in financial services, retail,
       and the public sector expect to be using FLOSS. OpenForum Europe published in
       February 2002 a survey titled [210]Market Opportunity Analysis For Open
       Source Software. Over three months CIOs and financial directors in financial
       services, retail and public sector were interviewed for this survey. In this
       survey, 37% of the CIOs stated that they were already using FLOSS, and 49%
       expected to be using FLOSS in the future. It is quite likely that even more
       companies are using FLOSS but their CIOs are not aware of it. Perceived
       benefits cited included decreased costs in general (54%), lower software
       license cost (24%), better control over development (22%), and improved
       security (22%).
   19. IBM found a 30% growth in the number of enterprise-level applications for
       GNU/Linux in the six month period ending June 2001. At one time, it was
       common to claim that "Not enough applications run under GNU/Linux" for
       enterprise-level use. However, [211]IBM found there are over 2,300 GNU/Linux
        applications (an increase of 30% over 6 months) available from IBM and the
       industry's top independent software vendors (ISVs). A [212]Special report by
       Network Computing on Linux for the Enterprise discusses some of the strengths
       and weaknesses of GNU/Linux, and found many positive things to say about
       GNU/Linux for enterprise-class applications.
   20. Morgan Stanley found significant and growing use of GNU/Linux. [213]They
        surveyed 225 CIOs in August 2002, and among the respondents, 29% said they
       owned GNU/Linux servers, 8% did not but are formally considering buying them,
       and 17% of the CIOs said they neither owned nor were formally considering
       GNU/Linux servers but that they were informally considering them. The
       remainder (slightly less than half, or 46%) noted they didn't own and weren't
       considering GNU/Linux. For those that have recently purchased new GNU/Linux
       servers, 31% were adding capacity, 31% were replacing Windows systems, 24%
       were replacing Unix and 14% were replacing other OSes. It's easier to
       transition to GNU/Linux from Unix than from Windows, so it's intriguing that
       Windows was being replaced more often than Unix. [214]CNet news commented on
       this study with additional commentary about open source vs. Microsoft.
   21. Revenue from sales of GNU/Linux-based server systems increased 90% in the
       fourth quarter of 2002 compared to the fourth quarter of 2001. This 90%
       increase compared sharply with the 5% increase of server market revenue
       overall. This data was determined by Gartner Dataquest, and [215]reported in
       C|Net.
       [216]Sales of GNU/Linux servers increased 63% from 2001 to 2002. This is an
       increase from $1.3 billion to $2 billion, according to Gartner.
   22. [217]In a survey of business users by Forrester Research Inc., 52% said they
       are now replacing Windows servers with Linux. Business Week quoted this
       survey in a January 2005 article, noting that GNU/Linux is forcing Microsoft
       to offer discounts to avoid losing even more sales.
   23. A 2001 survey found that 46.6% of IT professionals were confident that their
       organizations could support GNU/Linux, a figure larger than any OS except
       Windows. A [218]TechRepublic Research survey titled Benchmarks, Trends, and
       Forecasts: Linux Report found that "support for Linux runs surprisingly deep"
       when it surveyed IT professionals and asked them how confidently their
       organizations could support various OSes. Given Windows' market dominance on
       the desktop, it's not surprising that most were confident that their
       organizations could support various versions of Windows (for Windows NT the
       figure was 90.6%; for Windows 2000, 81.6%). However, GNU/Linux came in third,
       at 46.4%; about half of those surveyed responded that their organizations
       were already confident in their ability to support GNU/Linux! This is
       especially shocking because GNU/Linux beat other well-known products with
       longer histories including Unix (42.1%), Novell Netware (39.5%), Sun Solaris
       (25.7%), and Apple (13.6%). TechRepublic suggested that there are several
       possible reasons for this surprisingly large result:
          + GNU/Linux is considered to be a rising technology; many IT professionals
            are already studying it and learning how to use it, assuming that it
            will be a marketable skill in the near future.
          + Many IT professionals already use GNU/Linux at home, giving GNU/Linux an
            entree into professional organizations.
          + Since GNU/Linux is similar to Unix, IT professionals who are proficient
            in Unix can easily pick up GNU/Linux.
       TechRepublic suggests that IT executives should inventory their staff's skill
       sets, because they may discover that their organization can already support
       GNU/Linux if they aren't currently using it.
   24. Sendmail, an FLOSS program, is the leading email server, per surveys by D.J.
       Bernstein. A [219]survey between 2001-09-27 and 2001-10-03 by D.J. Bernstein
       of one million random IP addresses successfully connected to 958 SMTP (email)
       servers (such servers are also called mail transport agents, or MTAs).
       Bernstein found that Unix Sendmail had the largest market share (42% of all
       email servers), followed by Windows Microsoft Exchange (18%), Unix qmail
       (17%), Windows Ipswitch IMail (6%), Unix smap (2%), UNIX Postfix (formerly
       VMailer, 2%) and Unix Exim (1%). Note that Bernstein implements one of
       Sendmail's competitors (qmail), so he has a disincentive to identify
       Sendmail's large market share. At the time qmail was not FLOSS, because
       [220]modified derivatives of Qmail could not be freely redistributed (without
        express permission by the author). Qmail was "source viewable", so some
        people were confused into believing that Qmail was FLOSS. Since then,
        [221]qmail has been released to the public domain and is thus FLOSS. However,
       Sendmail, Postfix, and Exim were all FLOSS at the time. Indeed, not only is
       the leading program (Sendmail) FLOSS, but that FLOSS program has more than
       twice the installations of its nearest competition.
   25. MailChannel's survey (published 2007) showed that the top two email servers
       (Sendmail and Postfix) are FLOSS programs. [222]Fingerprinting the World's
       Mail Servers described a different survey approach: To avoid including
       spammers, they first started with a list of 400,000 companies worldwide, and
       then determined what their external email server software was. They even sent
        erroneous commands to double-check their results, since different servers
        produce distinctive replies (a toy sketch of this kind of probe appears after
        this list). The two most popular email servers were Sendmail (12.3%)
       and Postfix (8.6%). This was followed by Postini (8.5%), Microsoft Exchange
       (7.6%), MXLogic (6.0%), qmail (5.3%), and Exim (5.0%).
   26. A survey in the second quarter of 2000 found that 95% of all reverse-lookup
       domain name servers (DNS) used bind, an FLOSS product. The Internet is built
       from many mostly-invisible infrastructure components. This includes domain
       name servers (DNSs), which take human-readable machine names (like
       "yahoo.com") and translate them into numeric addresses. Publicly accessible
       machines also generally support "reverse lookups", which convert the numbers
       back to names; for historical reasons, this is implemented using the hidden
       "in-addr.arpa" domain. By surveying the in-addr domain, you can gain insight
       into how the whole Internet is supported. [223]Bill Manning has surveyed the
       in-addr domain and found that 95% of all name servers (in 2q2000) performing
       this important Internet infrastructure task are some version of "bind". This
       includes all of the [224]DNS root servers, which are critical for keeping the
       Internet functioning. Bind is an FLOSS program.
   27. A survey in May 2004 found that over 75% of all DNS domains are serviced by
       an FLOSS program. [225]Don Moore's DNS Server Survey completed May 23, 2004
       surveyed DNS servers. He found that BIND (an FLOSS program) serviced 70.105%
       of all domains, followed by TinyDNS (15.571%), Microsoft DNS Server (6.237%),
       MyDNS (2.792%), PowerDNS (1.964%), SimpleDNS Plus (1.25%), unknown (1.138%),
       and the Pliant DNS Server (0.277%), with many others trailing. Since BIND,
       MyDNS, PowerDNS, and Pliant are all FLOSS, FLOSS programs service 75.138% of
       all DNS domains. The figures are different if you count per-installation
        instead of per-domain, but FLOSS still dominates. Counting per-installation,
        we have BIND (72.598%), Microsoft (21.711%), TinyDNS (2.587%), unknown
        (1.041%), Simple DNS Plus (0.922%), MyDNS (0.314%), and PowerDNS (0.26%).
        Totalling BIND, MyDNS, and PowerDNS produces the only slightly smaller
        figure of 73.172% of installations served by FLOSS. The difference between
        the two counts shows that about 3 out of 4 organizations choose the FLOSS
        BIND when installing a DNS server, and that the organizations which instead
        choose Microsoft tend to be those supporting fewer domains (otherwise
        Microsoft's per-domain count would be larger). In any
       case, given the critical nature of DNS to the Internet, it's clear that FLOSS
       is a critical part of it.
   28. PHP is the web's #1 Server-side Scripting Language. PHP, a recursive acronym
       for "PHP: Hypertext Preprocessor", is an open source server-side scripting
        language designed for creating dynamic Web pages (e.g., e-commerce sites).
       [226]As noted in a June 3, 2002 article, PHP recently surpassed Microsoft's
       ASP to become the most popular server-side Web scripting technology on the
       Internet, and was used by over 24% of the sites on the Internet. Of the 37.6
       million web sites surveyed worldwide, PHP is running on over 9 million sites,
       and over the years 2000 through 2002 PHP has averaged a 6.5% monthly growth
        rate. Since that time, [227]PHP has continued to be widely used. (The rates
        increased through 2003, and then declined slightly, though this is
       probably due to the many alternative technologies available, such as Python
       and Ruby.)
   29. OpenSSH is the Internet's #1 implementation of the SSH security protocol. The
       Secure Shell (SSH) protocol is widely used to securely connect to computers
       and control them remotely (using either a text or X-Windows graphical
        interface). In April 2002, a survey of 2.4 million Internet addresses found
       that OpenSSH, an FLOSS implementation of SSH, was the #1 implementation, with
       66.8% of the market; the proprietary "SSH" had 28.1%, Cisco had 0.4%, and
       others totaled 4.7%. By September 2004, OpenSSH had grown to a dominant 87.9%
       share. You can see [228]general information about the survey, the
       [229]specific SSH statistics for April 2002, and [230]specific SSH statistics
       for September 2004. It's also interesting to note that OpenSSH had less than
       5% of the market in the third quarter of 2000, but its use steadily grew. By
       the fourth quarter of 2001, over half of all users of the SSH protocol were
       using OpenSSH, and its market share has continued to grow since.
   30. CMP TSG/Insight found that 41% of application development tools were FLOSS,
       and VARBusiness found 20% of all companies using GNU/Linux. [231]VARBusiness
       reported in September 2003 on "The Rise of Linux". In the article, it reports
       a finding of CMP TSG/Insight: 41% of application development tools in use
       were FLOSS, second only to Microsoft (76%) and leading Oracle (35%), IBM
       (26%), Sun (21%), and Borland (18%). They also reported their own finding
        that 20% of all companies they surveyed were using GNU/Linux, presumably less
        than that of Microsoft, but twice that of Netware and Unix. Indeed, they note
        that
       GNU/Linux has transformed "from a curiosity to a core competency".
    31. MySQL's market share is growing faster than Microsoft's. [232]An Evans Data
       survey released in January 2004 found that the use of FLOSS database MySQL
       grew 30% over the year, vs. 6% for Microsoft's SQL Server and Access
       databases, according to a survey of 550 developers. Microsoft still has a far
       greater total market share in the database development market, but Evans Data
       reported that FLOSS's "price and its ability to integrate with other software
       mesh well with the priorities of application developers" and that "Concerns
       over stability, expense and how well a database plays with others are leading
       a quickly growing number of...companies to seriously consider and implement
       an open source database solution". Evans Data noted that "We expect this
       trend to continue as the open source offerings are continually improved
       upon".
   32. As of 2004, a CSC study determined that an astonishing 14% of the large
       enterprise office systems market are using FLOSS OpenOffice.org. Consulting
       firm Computer Sciences Corp. (CSC) unsurprisingly found that Microsoft
       dominates the office suite market, with 95% of the overall share and more
       than 300 million users worldwide. But surprisingly, they found that [233]the
       FLOSS OpenOffice.org has secured 14% of the large enterprise office systems
       market, with over 16 million downloads and countless CD installations.
   33. A February 2005 survey of developers and database administrators found that
       64% use an Open Source database. [234]Evans Data Corp.'s "Winter 2005
       Database Development Survey" of developers and database administrators
       (DBAs), released February 2005, found a strong increase in use of a variety
       of FLOSS databases throughout corporate U.S. Evans found 64% (about
       two-thirds) use Open Source databases (up from 58% the previous year), and
       over 50% use (or plan to use) XQuery and other open web services standards
       with their data -- Open Source or proprietary.
        Two key factors seem to be driving this rise: survey respondents indicated that
       FLOSS databases are increasing their performance and scalability to the point
       where they are acceptable for use in corporate enterprise environments, and
       many organizations have tight IT and database development budgets. Evans
       found that MySQL, PostgreSQL, and Firebird were popular FLOSS databases.
       Evans found FireBird is the most used database among all database programs
       for `edge' applications, with Microsoft Access as a close second (at 21%). In
       addition, MySQL and FireBird are locked in a virtual tie in the FLOSS
        database space; each is used by just over half of the database developers who
       use FLOSS databases.
   34. BusinessWeek reports that hardware companies are selling more than $1 billion
       in servers to run Linux every quarter. [235]BusinessWeek's article "Torvalds'
       Baby Comes of Age" (October 3, 2005) reports that hardware companies are
       selling "more than $1 billion in servers to run Linux every quarter, while
       sales of servers running proprietary software continue to fall". They note
       that, according to market research company IDC, "Linux is now commonplace on
       big corporate servers -- posting 11 consecutive quarters of growth". They
       also quote IBM stating that 10 million desktops ran Linux in 2004, by their
        figures a 40% jump from the year before.
   35. InformationWeek's February 2005 survey reported significant use of GNU/Linux,
        and that 90% of companies anticipate a jump in server licenses for
       GNU/Linux. [236]InformationWeek Research Brief "Linux Outlook" published
       February 2005 found that the "open-source movement is growing" and that,
       given the trends, the expected outcome is "Increased use of Linux and
       open-source software [and] a decline in the use of Windows NT, 2000, and XP.
       Two years ago a major hurdle in the use of Linux was reliable support and
       service, but no more". Their survey was conducted in January 2005, surveying
       439 business technology professionals. They found that "Open-source products
       are most commonly deployed on server operating systems, Web server
       applications, application development tools, and application servers". Four
       out of five sites use GNU/Linux on Web or Intranet servers. More
       specifically, when asked "In what areas is your organization using Linux",
        the top areas where GNU/Linux is used include server operating systems
       (75%), web server applications (75%), application development tools (68%),
       application server (56%), and desktop/laptop operating system (47%). In the
        next 12 months, Linux is expected to replace Windows NT or Windows 2000
        servers at nearly half of the sites surveyed. Three in five sites expect
       to use Linux on servers instead of Windows NT or Windows 2000, and in fact,
       "nearly 90% of companies surveyed anticipate a jump in server licenses for
       Linux. No other product comes close to these expectations -- not Windows,
       Macintosh or Unix". The top Linux distributions (in order) were Red Hat,
       Novell/SuSE, and Debian.
       Why so much use? "Low cost and the lack of licensing fees are the primary
       reasons [77%] why companies deploy Linux on PCs and servers... However,
       concern about the vulnerability of Microsoft products is also speeding up
       Linux adoption. Of the sites using Linux on PCs, 73% are doing so in response
       to Windows security issues while 69% seek an alternative to Windows.
       Two-thirds of sites state that Windows security concerns are driving Linux
       adoption on servers while nearly three in five server users want another
       option to Windows... Linux has its edge on Windows, with low cost, reliable
       performance, secure environment, expected future innovation and confidence in
       open-source development model". Looking at their numbers in more detail bears
       this out. For servers, the primary reasons for using Linux were relatively
       low cost (77%), reliability (74%), performance (73%), Windows security issues
       (65%), needing an alternative to Windows (59%), recommendations by technical
       staff (59%), development tools widely available (46%), ability to modify
       source code to meet needs (45%), fast software patches and bug fixes (41%),
       and fulfills company requirements or standards (40%). The primary reasons for
       using Linux on PCs were similar though with different relative weights:
       relatively low cost (75%), Windows security issues (73%), need an alternative
       to Windows (69%), reliability (60%), performance (52%), recommendations by
        technical staff (45%), fast software patches and bug fixes (44%), development
       tools widely available (42%), ability to modify source code to meet needs
       (36%), fulfills company requirements or standards (34%).
       InformationWeek does not predict that everyone will be using GNU/Linux in all
       circumstances within a few years; instead, they believe their data suggests
       that "A myriad of operating system platforms will continue to typify IT
        architecture in 2005 and beyond". And the report certainly does not view the
        world through rose-colored glasses; it discusses some of the challenges that
        some users
       have had, too. But the report notes that in spite of this, "Linux is
       fulfilling the expectations of most users. Eighty-four percent of sites say
       they are highly satisfied with Linux-server performance [, and half of the
       sites reported] the same level of satisfaction on PCs". (my emphasis.)
   36. A 2007 survey claims that around half of all companies making embedded
       products are using Linux in them, with an increasing trend.
        [237]In LinuxDevice.com's 2007 survey of companies creating embedded systems,
        47% had used Linux and 3.6% had used eCos in at least some of those products
        in the past 2 years. Both are FLOSS, so that's barely over half of all such
        companies. The next largest was MS Windows at 12.3%, so Linux was clearly the
        leader. Even more interestingly, when asked which OSes their companies'
        embedded designs would use in the next 2 years, the developers expected Linux
        to be used by 59.3% and eCos by 3.7%, so the trend for FLOSS use is clearly
        up. This survey is
       from LinuxDevice.com, so there's the risk of number-fudging, but their
       readership is actually broader than their name might suggest; in much earlier
       surveys, only a minority of companies were even considering using Linux in 2
        years. Still, that's an important caveat, and the respondents are
        self-selected (which can often skew surveys). Even so, it's of interest in
       showing that there is a growing trend of use, and it has other interesting
       results about embedded environments.
   37. Optaros, a consulting firm, reports that 87% of organizations are now using
       open-source software; BusinessWeek claims that this demonstrates that FLOSS
       has greatly expanded into businesses. [238]BusinessWeek's December 2005
       article "A Watershed for Open Source" reported that in 2005 "open source was
       the word on the lips of not just early adopters but of an early majority". In
       particular, the article noted that "CIOs signed off on open-source projects
       [and not just] low-level engineers... on their own initiative [, and] venture
       capitalists woke up to the new business opportunities of open source". They
       claimed the major events of 2005 were that Red Hat made lots of money from
       free software (this "observation" ignores the fact that one of the companies
        Red Hat bought, Cygnus, had been doing that for many years), Sun Microsystems
        opened much of its software, Motorola bet big on mobile Linux, Firefox went
        mainstream, and venture capitalists invested in FLOSS (they estimate $400
        million was invested in FLOSS startups in 2005). BusinessWeek
       used as one of its supports a study by Optaros, who reports that 87% of
       organizations are now using FLOSS. This estimate may be low; many FLOSS
       deployments are made by lower-level people solving specific problems. Since
       there's usually no requirement to report FLOSS use (there's no particular
       reason to do so in many cases), upper management is often not aware when
       they're using it... they just know that problems are getting solved.
   38. IDC's Spring 2006 survey found that developers around the world are
       increasing their use of FLOSS. As reported in [239]It's not just Linux: Open
       Source has arrived, IDC surveyed over 5,000 developers from 116 countries in
       the spring of 2006. They found that FLOSS is "being used by 71% of the
       developers in the world and is in production at 54 percent of their
       organizations. In addition, half of the global developers claim that the use
       of open source is increasing in their organizations". Steven J.
       Vaughan-Nichols added that this report showed that "One way or the other,
       open-source methods and software are used almost everywhere... Open source is
       so pervasive that IDC declares in this study that open-source software
       represents the most significant all-encompassing and long-term trend that the
       software industry has seen since the early 1980s. IDC analysts also believe
       that open source will eventually play a role in the life-cycle of every major
       software category, and will fundamentally change the value proposition of
       packaged software for customers".
       Dr. Anthony Picardi, IDC's senior vice president of global software research,
       made some very interesting statements based on this study: "The use of open
       source beyond Linux is pervasive, used by almost three-quarters of
       organizations and spanning hundreds of thousands of projects... The real
       impact of open source is to sustain innovations in mature software markets,
       thus extending the useful life of software assets and saving customers
       money... As business requirements shift from acquiring new customers to
       sustaining existing ones, the competitive landscape will move towards costs
       savings and serving up sustaining innovations to savvy customers, along with
       providing mainstream software to new market segments that are willing to pay
       only a fraction of conventional software license fees," Picardi added. "Open
       source software is ultimately a resource for sustaining innovators".
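
   As a small aside, here is a minimal sketch (in Python) of the kind of
   mail-server probe described in item 25 above: it records an SMTP server's
   greeting and its reply to a deliberately unknown command, since different
   MTAs tend to word these replies differently. This is only an illustration of
   the general idea, not the methodology used in that survey, and the host name
   is a placeholder.

      import smtplib

      def probe_mta(host, port=25, timeout=10.0):
          """Return an SMTP server's greeting and its reply to an unknown command."""
          server = smtplib.SMTP(timeout=timeout)
          code, banner = server.connect(host, port)        # (220, greeting text)
          err_code, err_msg = server.docmd("NOTACOMMAND")  # deliberately invalid
          server.quit()
          return {
              "greeting": (code, banner.decode("ascii", "replace")),
              "unknown_command_reply": (err_code, err_msg.decode("ascii", "replace")),
          }

      if __name__ == "__main__":
          # Placeholder host for illustration only; a real survey would iterate
          # over the external mail servers of the companies being studied.
          print(probe_mta("mail.example.com"))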
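
   Similarly, to illustrate the reverse-lookup ("in-addr.arpa") mechanism that
   the DNS survey in item 26 relies on, the short Python sketch below builds the
   in-addr.arpa name for an IPv4 address (the octets are reversed and the suffix
   ".in-addr.arpa" is appended) and then asks the local resolver for the
   corresponding PTR record. The address is from the documentation range and is
   used only for illustration.

      import socket

      def reverse_name(ipv4):
          """Build the in-addr.arpa name used for a reverse (PTR) lookup."""
          octets = ipv4.split(".")
          return ".".join(reversed(octets)) + ".in-addr.arpa"

      def reverse_lookup(ipv4):
          """Ask the system resolver for the host name behind an IPv4 address."""
          host, _aliases, _addresses = socket.gethostbyaddr(ipv4)
          return host

      if __name__ == "__main__":
          ip = "192.0.2.10"             # documentation address, illustration only
          print(reverse_name(ip))       # -> "10.2.0.192.in-addr.arpa"
          try:
              print(reverse_lookup(ip))
          except OSError:
              print("no PTR record found for", ip)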

   Perhaps the simplest argument that GNU/Linux has a significant market share (and
   that it's increasing) is that [240]Sun is modifying its Solaris product to run
   GNU/Linux applications, and IBM has already announced that GNU/Linux will be the
   successor of IBM's own AIX.

                                      3. Reliability

   There are a lot of anecdotal stories that FLOSS is more reliable, but finally
   there is quantitative data confirming that mature FLOSS programs are often more
   reliable:
    1. Equivalent FLOSS applications are more reliable, according to the Fuzz study.
       The paper [241]"Fuzz Revisited" paper measured reliability by feeding
       programs random characters and determining which ones resisted crashing and
       freeze-ups. This approach is unlikely to find subtle failures, yet the study
       authors found that their approach still manages to find many errors in
       production software and is a useful tool for finding software flaws. What's
       more, this approach is extremely fair and can be broadly applied to any
       program, making it possible to compare different programs fairly.

                         Failure Rates as Measured by Fuzz Tests
        [242](Chart: failure rates as measured by fuzz tests show that FLOSS was
        the most reliable.)
        FLOSS had higher reliability by this measure. The paper states in section
        2.3.1 that:

     It is also interesting to compare results of testing the commercial systems to
     the results from testing "freeware" GNU and Linux. The seven commercial
     systems in the 1995 study have an average failure rate of 23%, while Linux has
     a failure rate of 9% and the GNU utilities have a failure rate of only 6%. It
     is reasonable to ask why a globally scattered group of programmers, with no
     formal testing support or software engineering standards can produce code that
     is more reliable (at least, by our measure) than commercially produced code.
     Even if you consider only the utilities that were available from GNU or Linux,
     the failure rates for these two systems are better than the other systems.
       There is evidence that Windows applications have even less reliability than
        the proprietary Unix software (and thus less reliable than the FLOSS software).
       A later paper published in 2000, [243]"An Empirical Study of the Robustness
       of Windows NT Applications Using Random Testing", found that with Windows NT
       GUI applications, they could crash 21% of the applications they tested, hang
       an additional 24% of the applications, and could crash or hang all the tested
       applications when subjecting them to random Win32 messages. Indeed, to get
       less than 100% of the Windows applications to crash, they had to change the
       conditions of the test so that certain test patterns were not sent. Thus,
       there's no evidence that proprietary Windows software is more reliable than
       FLOSS by this measure. Yes, Windows has progressed since that time - but so
       have the FLOSS programs.
       Although the FLOSS experiment was done in 1995, and the Windows tests were
       done in 2000, nothing that's happened since suggests that proprietary
       software has become much better than FLOSS programs since then. Indeed, since
       1995 there's been an increased interest and participation in FLOSS, resulting
       in far more "eyeballs" examining and improving the reliability of FLOSS
       programs.
       The fuzz paper's authors also found that proprietary software vendors
       generally didn't fix the problems identified in an earlier version of their
       paper (from 1990), and they found that concerning. There was a slight
       decrease in failure rates between their 1990 and 1995 paper, but many of the
       flaws they found (and reported) in the proprietary Unix programs were still
       not fixed 5 years later. In contrast, [244]Scott Maxwell led an effort to
       remove every flaw identified in the FLOSS software in the 1995 fuzz paper,
       and eventually fixed every flaw. Thus, the FLOSS community's response shows
       why, at least in part, FLOSS programs have such an edge in reliability; if
       problems are found, they're often fixed. Even more intriguingly, the person
       who spearheaded ensuring that these problems were fixed wasn't an original
       developer of the programs - a situation only possible with FLOSS.
       Now be careful: FLOSS is not magic pixie dust; beta software of any kind is
        still buggy! However, the 1995 experiment compared mature FLOSS to mature
       proprietary software, and the FLOSS software was more reliable under this
       measure.
    2. IBM studies found GNU/Linux highly reliable. [245]IBM ran a series of
       extremely stressful tests for 30 and 60 days, and found that the Linux kernel
       and other core OS components -- including libraries, device drivers, file
       systems, networking, IPC, and memory management -- operated consistently and
       completed all the expected durations of runs with zero critical system
       failures. Linux system performance was not degraded during the long duration
       of the run, the Linux kernel properly scaled to use hardware resources (CPU,
       memory, disk) on SMP systems, the Linux system handled continuous full CPU
       load (over 99%) and high memory stress well, and the Linux system handled
       overloaded circumstances correctly. IBM declared that these tests demonstrate
       that "the Linux kernel and other core OS components are reliable and stable
       ... and can provide a robust, enterprise-level environment for customers over
       long periods of time".
    3. GNU/Linux is more reliable than Windows NT, according to a 10-month ZDnet
       experiment. [246]ZDnet ran a 10-month test for reliability to compare Caldera
       Systems OpenLinux, Red Hat Linux, and Microsoft's Windows NT Server 4.0 with
       Service Pack 3. All three used identical (single-CPU) hardware, and network
       requests were sent to each server in parallel for standard Internet, file,
       and print services. The result: NT crashed an average of once every six
       weeks, each taking about 30 minutes to fix; that's not bad, but neither
       GNU/Linux server ever went down. This ZDnet article also does a good job of
       identifying GNU/Linux weaknesses (e.g., desktop applications and massive
       SMP). Hopefully Windows has made improvements since this study - but the
        FLOSS programs have certainly made improvements as well.
    4. GNU/Linux is more reliable than Windows NT, according to a one-year Bloor
       Research experiment. [247]Bloor Research had both OSes running on relatively
       old Pentium machines. During the one year test, GNU/Linux crashed once due to
       a hardware fault (disk problems), which took 4 hours to fix, giving it a
       measured availability of 99.95 percent. Windows NT crashed 68 times, caused
       by hardware problems (disk), memory (26 times), file management (8 times),
       and various odd problems (33 times). All this took 65 hours to fix, giving an
       availability of 99.26 percent. It's intriguing that the only GNU/Linux
       problem and many of the Windows problems were hardware-related; it could be
       argued that the Windows hardware was worse, or it could be argued that
       GNU/Linux did a better job of avoiding and containing hardware failures. The
       file management failure is due to Windows, and the odd problems appear due to
       Windows too, indicating that GNU/Linux is far more reliable than Windows.
        GNet summarized this by saying "the winner here is clearly Linux".
    5. A study by Reasoning found that the Linux kernel's implementation of the
       TCP/IP Internet protocol stack had fewer defects than the equivalent stacks
       of several proprietary general-purpose operating systems, and equalled the
       best of the embedded operating systems. As noted in [248]their press release
       and [249]C|Net, Reasoning's study compared six implementations of TCP/IP, the
       fundamental protocols underlying the Internet. Besides the Linux kernel,
       three of the implementations were part of commercial general-purpose
       operating systems, and two were embedded in commercial telecommunications
        equipment. The Linux kernel is primarily used as the kernel of a general-purpose
       operating system; it would be reasonable to expect that the embedded
       operating systems would have better reliability because of the need for
       reliability in that market. The study was not commissioned by any of the
       GNU/Linux vendors or companies who might be competing with GNU/Linux, and
       thus should be free of bias.
        The company used automated tools to look for five kinds of defects in code:
        memory leaks, null pointer dereferences, bad deallocations, out-of-bounds
        array accesses, and uninitialized variables. Reasoning found 8 defects in
        81,852 source lines of code (SLOC) of the Linux kernel, resulting in a defect
        density of 0.1 defects per KSLOC. In contrast, the three proprietary
       general-purpose operating systems (two of them versions of Unix) had between
       0.6 and 0.7 defects/KSLOC; thus the Linux kernel had a smaller defect rate
       than all the competing general-purpose operating systems examined. The rates
        of the two embedded operating systems were 0.1 and 0.3 defects/KSLOC; thus,
        the Linux kernel had a defect rate better than one embedded operating
        system and equivalent to the other.
       One issue is that the tool detects issues that may not be true problems. For
       example, of those 8 defects, one was clearly a bug and had been separately
       detected and fixed by the developers, and 4 defects clearly had no effect on
       the running code. None of the defects found were security flaws. To counter
       this, they also tracked which problems were repaired by the developers of the
       various products. The Linux kernel did quite well by this measure as well:
       the Linux kernel had 1 repaired defect out of 81.9 KSLOC, while the
       proprietary implementations had 235 repaired defects out of 568 KSLOC. This
       means the Linux kernel had a repair defect rate of 0.013 defects/KSLOC, while
       the proprietary implementations had a repair defect rate of 0.41
       defects/KSLOC.
       CEO Scott Trappe explained this result by noting that the open source model
       encourages several behaviors that are uncommon in the development of
       commercial code. First, many users don't just report bugs, as they would do
       with [proprietary] software, but actually track them down to their root
       causes and fix them. Second, many developers are reviewing each other's code,
       if only because it is important to understand code before it can be changed
       or extended. It has long been known that peer review is the most effective
       way to find defects. Third, the open source model seems to encourage a
       meritocracy, in which programmers organize themselves around a project based
       on their contributions. The most effective programmers write the most crucial
       code, review the contributions of others, and decide which of these
       contributions make it into the next release. Fourth, open source projects
       don't face the same type of resource and time pressures that [proprietary]
       projects do. Open source projects are rarely developed against a fixed
       time-line, affording more opportunity for peer review and extensive beta
       testing before release.
       This certainly doesn't prove that FLOSS will always be the highest quality,
       but it clearly shows that FLOSS can be of high quality.
    6. A similar study by Reasoning found that the MySQL database (a leading FLOSS
       database) had fewer defects than a set of 200 proprietary programs used for
       comparison. In a similar manner to the previous study, on December 15, 2003,
       [250]Reasoning announced its analysis results comparing MySQL with various
        proprietary programs. Reasoning found 21 software defects in MySQL's 236,000
        source lines of code (SLOC), producing a defect density of 0.09 defects/KSLOC.
        Using
       a set of 200 recent proprietary projects (totalling 35 million SLOC), the
       same tools found a defect rate of 0.57 defects/KSLOC -- over six times the
       error rate. Again, not all defects are found by their tool, and this
       certainly doesn't prove that FLOSS will always be the highest quality, but it
       clearly shows that FLOSS can be of high quality.
    7. A study by Coverity found that the Linux kernel had far fewer defects than
       the industry average. [251]Code-analysis firm Coverity performed a four-year
       research effort and found that the Linux kernel has significantly fewer
       software bugs in it than the industry average. Coverity's approach reported
        985 defects in the 5.7 million lines of code that make up the Linux
       kernel. According to data from Carnegie Mellon University, a typical program
       of similar size would usually have more than 5,000 defects. Coverity CEO Seth
       Hallem summarized this by saying, "Linux is a very good system in terms of
       bug density". It's not known how this compares to Microsoft Windows; Coverity
       did not have access to source code for the Microsoft Windows kernel. Coverity
       also did not have the source code for the many third-party drivers for
       Windows; these would need to be included for an accurate comparison,
       especially since Windows driver problems are known to be a significant
       problem in the reliability of many Windows deployments.
       [252]Coverity reported newer results in August 2005, showing defect densities
       were very low (and had even gone down further). Their follow-up analysis of
       Linux kernel 2.6.12 found that all six critical defects they had found in
        their earlier study of Linux kernel 2.6.9 had been fixed. The August 2005
       study found an average of 0.16 defects/KSLOC, down from 0.17 defects/KSLOC,
       even though the amount of code had increased, and "Although contributors
       introduced new defects, these were primarily in non-critical device drivers".
    8. Sites using Microsoft's IIS web serving software have over double the time
        off-line (on average) of sites using the Apache software, according to a
       3-month Swiss evaluation. These are the results of [253]Syscontrol AG's
        analysis of website uptime (announced February 7, 2000). They measured over
       100 popular Swiss web sites over a three-month period, checking from 4
       different locations every 5 minutes (it'd be interesting to see what a larger
       sample would find!). You can [254]see their report (in German), or a
       [255]Babelfish (machine) translation of the report. Here's their set of
       published data on "average down-time (in hours in that month) for each type
       of server", plus a 3-month average that I've computed:

                      Downtime (hours)  Apache Microsoft Netscape  Other
                      September           5.21     10.41     3.85   8.72
                      October             2.66      8.39     2.80  12.05
                      November            1.83     14.28     3.39   6.85
                      Average             3.23     11.03     3.35   9.21

       It's hard not to notice that Apache (the OSS web server) had the best results
       over the three-month average (and with better results over time, too).
       Indeed, Apache's worst month was better than Microsoft's best month. The
       difference between Netscape and Apache is statistically insignificant - but
       this still shows that the freely-available FLOSS solution (Apache) has a
       reliability at least as good as the most reliable proprietary solution.
       The report does state that this might not be solely the fault of the
       software's quality, and in particular it noted that several Microsoft IIS
       sites had short interruptions at the same time each day (suggesting regular
        restarts). However, this still raises the question: why did the IIS sites
       require so many regular restarts compared to the Apache sites? Every outage,
       even if pre-planned, results in a service loss (and for e-commerce sites, a
       potential loss of sales). Presumably, IIS site owners who perform periodic
       restarts do so because they believe that doing so will improve their IIS
       systems' overall reliability. Thus, even with pre-emptive efforts to keep the
       IIS systems reliable, the IIS systems are less reliable than the Apache-based
       systems which simply do not appear to require constant restarting.
    9. 80% of the top ten most reliable hosting providers ran FLOSS, according to
        Netcraft's May 2004 survey. [256]Netcraft's May 2004 survey of the top ten
       most reliable hosting providers found 4 running GNU/Linux, 4 running FreeBSD,
       and only 2 running Microsoft Windows.
   10. FLOSS did very well in a separate uptime study by Netcraft; as of August 3,
       2001, of the 50 sites with the highest uptimes, 92% use Apache and 50% run on
        FLOSS OSes. Netcraft keeps track of the 50 often-requested sites with the
       longest uptimes at [257]http://uptime.netcraft.com. Looking at [258]the
       August 3, 2001 uptime report, I found that 92% (46/50) of the sites use
       Apache; one site's web server was unknown, and three others were not Apache.
       Of those three, only one reported to be Microsoft IIS, and that one instance
       is suspicious because its reported OS is BSD/OS (this apparent inconsistency
       can be explained in many ways, e.g., perhaps there is a front-end BSD/OS
       system that "masks" the IIS web site, or perhaps the web server is lying
       about its type to confuse attackers). In this snapshot, 50% (25/50) ran on an
       FLOSS OS, and only Unix-like OSes had these large uptimes (no Windows systems
       were reported as having the best uptimes).
       As with all surveys, this one has weaknesses, as discussed in [259]Netcraft's
       Uptime FAQ. Their techniques for identifying web server and OSes can be
       fooled. Only systems for which Netcraft was sent many requests were included
       in the survey (so it's not "every site in the world"). Any site that is
       requested through the "what's that site running" query form at Netcraft.com
       is added to the set of sites that are routinely sampled; Netcraft doesn't
       routinely monitor all 22 million sites it knows of for performance reasons.
       Many OSes don't provide uptime information and thus can't be included; this
       includes AIX, AS/400, Compaq Tru64, DG/UX, MacOS, NetWare, NT3/Windows 95,
       NT4/Windows 98, OS/2, OS/390, SCO UNIX, Sony NEWS-OS, SunOS 4, and VM. Thus,
       this uptime counter can only include systems running on BSD/OS, FreeBSD (but
       not the default configuration in versions 3 and later), recent versions of
       HP-UX, IRIX, GNU/Linux 2.1 kernel and later (except on Alpha processor based
       systems), MacOS X, recent versions of NetBSD/OpenBSD, Solaris 2.6 and later,
       and Windows 2000. Note that Windows NT systems cannot be included in this
        survey (because their uptimes couldn't be counted). Windows 2000 systems'
        data are included in the source data for this survey, but they have a
        different problem. Windows 2000 had little hope of being included in the August
       2001 list, because the 50th system in the list had an uptime of 661 days, and
       Windows 2000 had only been launched about 17 months (about 510 days) earlier.
       Note that HP-UX, GNU/Linux (usually), Solaris and recent releases of FreeBSD
       cycle back to zero after 497 days, exactly as if the machine had been
       rebooted at that precise point. Thus it is not possible to see an HP-UX,
       GNU/Linux (usually), or Solaris system with an uptime measurement above 497
       days, and in fact their uptimes can be misleading (they may be up for a long
       time, yet not show it). There is yet one other weakness: if a computer
       switches operating systems later, the long uptime is credited to the new OS.
       Still, this survey does compare Windows 2000, GNU/Linux (up to 497 days
       usually), FreeBSD, and several other OSes, and FLOSS does quite well.
       It could be argued that perhaps systems on the Internet that haven't been
       rebooted for such a long time might be insignificant, half-forgotten,
       systems. For example, it's possible that security patches aren't being
       regularly applied, so such long uptimes are not necessarily good things.
       However, a counter-argument is that Unix and Linux systems don't need to be
       rebooted as often for a security update, and this is a valuable attribute for
       a system to have. Even if you accepted that unproven claim, it's certainly
       true that there are half-forgotten Windows systems, too, and they didn't do
       so well. Also, only systems someone specifically asked for information about
       were included in the uptime survey, which would limit the number of
       insignificant or half-forgotten systems.
       At the very least, Unix and Linux are able to quantitatively demonstrate
       longer uptimes than their Windows competitors can, so Unix and Linux have
       significantly better evidence of their reliability than Windows.
    11. An in-depth analysis (published in the Communications of the ACM) found good
        evidence that FLOSS code quality appears to be at least equal to, and
        sometimes better than, that of proprietary software. The article "Open
        Source Software
       Development Should Strive for Even Greater Code Maintainability" by Ioannis
       Samoladas, Ioannis Stamelos, Lefteris Angelis, and Apostolos Oikonomou, was
       published by the highly-respected "Communications of the ACM" in October 2004
       (pp. 83-87). A minor variation of this paper was reprinted and made globally
       accessible in [260]Programming Languages, Vol. 2, No. 9 - Dec/Jan 2004-2005.
       The authors studied almost 6 million lines of code, tracking several programs
       over time, using the maintainability index (chosen by the Software
       Engineering Institute as the most suitable tool for measuring the
       maintainability of systems). Using their measurements, they concluded that
       FLOSS "code quality appears to be at least equal and sometimes better than
       the quality of [closed source software] code implementing the same
       functionality". They conjectured that this "may be due to the motivation of
        skilled OSS programmers...". FLOSS is no panacea; they also found that FLOSS
       "code quality seems to suffer from the very same problems that have been
       observed in [closed source software] projects. Maintainability deterioration
       over time is a typical phenomenon... it is reasonable to expect similar
       behavior from the OSS projects as they age". Clearly, FLOSS is not a silver
       bullet; developers of FLOSS programs have to work to keep their programs
       maintainable, and it is difficult to keep a program maintainable as it grows
       over time. FLOSS was found to have equal and sometimes better maintainability
       than proprietary programs, and that is a very encouraging result.
   12. A detailed study of two large programs (the Linux kernel and the Mozilla web
       browser) found evidence that FLOSS development processes produce more modular
       designs. Harvard Business School's [261]"Exploring the Structure of Complex
       Software Designs: An Empirical Study of Open Source and Proprietary Code" by
       Alan MacCormack, John Rusnak, and Carliss Baldwin (Working Paper Number
        05-016) reports research that examined whether FLOSS programs tend to have
        better modularity than proprietary programs. It's generally accepted
       that there are important benefits to greater modularity, in particular, a
       more modular system tends to be more reliable and easier to change over time.
       They examined the Linux kernel (developed as an FLOSS product), the original
       Mozilla web browser (developed as a proprietary product), and then the
       evolution of Mozilla after it became FLOSS. They found "significant
       differences in their designs"; Linux possessed a more modular architecture
       than the original proprietary Mozilla, and the redesigned FLOSS Mozilla had a
       more modular structure than both.
       To measure design modularity, they used a technique called Design Structure
       Matrices (DSMs) that identified dependencies between different design
       elements (in this case, between files, where calling a function/method of
       another file creates a dependency). They used two different measures using
       DSMs, which produced agreeing results.
        The first measure they computed is a simple one, called "change cost". This
        measures the percentage of elements affected, on average, when a change is
        made to one element in the system (a small sketch of this computation
        appears after this list). A smaller value is better, since as this value
        gets larger, it becomes increasingly likely that a change made will impact a
        larger number of other components and have unintended consequences. This
        measure isn't that sensitive to the size of a system (see their exhibit 7),
        though obviously as a program gets larger that percentage implies a larger
        number of components. When Mozilla was developed as a proprietary product,
        and initially released as FLOSS, it had the large value of 17.35%. This
        means that if a given file is changed, on average, 17.35% of the other files
        in the system depend (directly or indirectly) on that file. After gaining
        some familiarity with the code, the FLOSS developers decided to improve its
        design between 1998-10-08 and 1998-12-11. Once the redesign was complete,
        the change cost dropped dramatically to 2.78%, as you can see:

                                 Program       Change Cost
                            Mozilla-1998-04-08 17.35%
                            Mozilla-1998-10-08 18.00%
                            Mozilla-1998-12-11 2.78%
                            Mozilla-1999       3.80%
                            Linux-2.1.88       3.72%
                            Linux-2.1.105      5.16%

       Change cost is a fairly crude measure, though; it doesn't take into account
       the amount of dependency (measured, say, as the number of calls from one file
       to another), and it doesn't take clustering into account (a good design
       should minimize the communication between clusters more than communication in
       general). Thus, they computed "coordination cost," an estimated cost of
       communicating information between agents developing each cluster. This
       measure is strongly dependent on the size of the system - after all, it's
       easier to coordinate smaller projects. Thus, to use this as a measure of the
       quality of a design compared to another project, the sizes must be similar
       (in this case, by the number of files). The numbers are unitless, but smaller
       costs are better. The researchers identified different circumstances with
       similar sizes, so that the numbers could be compared. The following table
       compares Mozilla 1998-04-08 (built almost entirely by proprietary means) and
       Mozilla 1998-12-11 (just after the redesign by FLOSS developers) with Linux
       2.1.105 (built by FLOSS processes):

                             Linux 2.1.105 Mozilla 1998-04-08 Mozilla 1998-12-11
      Number of Source files 1678          1684               1508
        Coordination Cost    20,918,992    30,537,703         10,234,903

        The paper computes numbers for several other cases, all yielding the same
        conclusion.
       It'd be easy to argue that kernels are fundamentally different than web
       browsers, but that can't be the right explanation. When Mozilla was released
       to the FLOSS community, it was far worse by these measures, and the FLOSS
       community actively and consciously worked to improve its modularity. The
       browser soon ended up with a significant and measurable improvement in
        modularity, better than the kernel's, without any obvious loss of
        functionality.
       It appears that at least part of the explanation is in the FLOSS development
       environment. FLOSS development is normally distributed worldwide, with little
       opportunity for face-to-face communication, and with many people contributing
       only part-time. Thus, "this mode of organization was only possible given that
       the design structure, and specifically, the partitioning of design tasks, was
       loosely-coupled". In addition, the leadership of an FLOSS project is
       incentivized to make architectural decisions that lead to modularity, since
       if they didn't, they wouldn't be able to attract enough co-developers:
       "Without such an architecture, there was little hope that other contributors
       could a) understand enough of the design to contribute in a meaningful way,
       and b) develop new features or fix existing defects without affecting many
       other parts of the design". Although not discussed in the paper, cultural
       norms may also be a factor; since the source code is reviewed by others,
       developers appear to actively disparage poor designs and praise highly
       modular designs.
       Again, this does not mean that FLOSS programs are always more modular; but it
       does suggest that there is pressure to make modular programs in an FLOSS
       project.
   13. German import company Heinz Tröber found Linux-based desktops to be far more
       reliable than Windows desktops; Windows had a 15% daily failure rate, while
        Linux had 0%. [262]Günter Stoverock, the data processing manager at German
        import company Heinz Tröber, reported that the company had decided to run its
        ERP
       software on Linux-based systems, instead of Windows, because Windows was much
       less reliable. Stoverock stated that on Windows, "Out of 65 desktops, around
       10 desktops crashed daily... Employees wasted around 30 minutes, that's five
       times 30 minutes per week". Note that this is a 15% daily failure rate, and
       the actual impacts were almost certainly more severe than simply a loss of 2
        minutes per reboot. After all, this generous calculation ignores
       the cost of lost time due to lost data (requiring re-entry), time to restart
       whatever action they were doing, and the time for people to regain their
       focus on what they were doing. Stoverock then stated "That's not acceptable
       -- we had to do something [to solve this]". The company switched to Linux
       desktop systems in 2001, and has had no downtime at all since (through March
       2005). He reported that "There are no problems -- in the morning you turn the
       computer on, in the afternoon you turn it off -- that's it". I do not have
       more detailed information than this about their particular environment and
       results, which is a significant limitation of this report. On the other hand,
       I found no evidence that they have any reason to prefer either platform, and
        it appears that the functionality and usage were the same on both platforms,
        suggesting that this is a valid comparison.
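
    To make the "change cost" idea from item 12 concrete, here is a minimal sketch,
    in Python, of how such a measure can be computed from a file-level dependency
    matrix. The toy matrix, the helper name change_cost, and the choice to count
    each file as reachable from itself are illustrative assumptions of mine, not
    code or data from the MacCormack, Rusnak, and Baldwin paper; consult that paper
    for the exact definition they used.

        def change_cost(dep):
            """Percentage of file pairs coupled directly or indirectly.

            dep[i][j] == 1 means file i directly depends on (calls into) file j.
            """
            n = len(dep)
            # Start from the direct dependencies, treating each file as
            # reachable from itself (an assumption of this sketch).
            vis = [[bool(dep[i][j]) or i == j for j in range(n)] for i in range(n)]
            for k in range(n):              # Warshall's transitive closure
                for i in range(n):
                    if vis[i][k]:
                        for j in range(n):
                            if vis[k][j]:
                                vis[i][j] = True
            coupled = sum(row.count(True) for row in vis)
            return 100.0 * coupled / (n * n)

        # Hypothetical five-file system: 0 -> 1 -> 2 and 3 -> 4.
        toy = [[0, 1, 0, 0, 0],
               [0, 0, 1, 0, 0],
               [0, 0, 0, 0, 0],
               [0, 0, 0, 0, 1],
               [0, 0, 0, 0, 0]]
        print("change cost: %.2f%%" % change_cost(toy))   # prints 36.00%

    On real systems the matrices have thousands of files rather than five; the
    point of the sketch is the idea of propagating dependencies, not the toy
    numbers.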

   Damien Challet and Yann Le Du of the University of Oxford have written a paper
   titled [263]Closed source versus open source in a model of software bug dynamics.
   In this paper they develop a model of software bug dynamics where users,
   programmers and maintainers interact through a given program. They then analyzed
   the model, and found that all other things being equal (such as number of users,
   programmers, and quality of programmers), "debugging in open source projects is
   always faster than in closed source projects".

   Of course, there are many anecdotes about Windows reliability vs. Unix. For
   example, the [264]Navy's "Smart Ship" program caused a complete failure of the
    USS Yorktown in September 1997. Whistle-blower Anthony DiGiorgio stated that
   Windows is "the source of the Yorktown's computer problems". Ron Redman, deputy
   technical director of the Fleet Introduction Division of the Aegis Program
   Executive Office, said "there have been numerous software failures associated
   with [Windows] NT aboard the Yorktown". Redman also said "Because of politics,
   some things are being forced on us that without political pressure we might not
   do, like Windows NT... If it were up to me I probably would not have used Windows
   NT in this particular application. If we used Unix, we would have a system that
   has less of a tendency to go down".

    Reliability is increasingly important in software. ABI Research's 2004 study
   "Automotive Electronics Systems: Market Requirements for Microcontrollers,
   Accelerometers, Hall Effect and Pressure Sensors" found that [265]approximately
   30% of all automotive warranty issues today are software and silicon-related.

   One problem with reliability measures is that it takes a long time to gather data
   on reliability in real-life circumstances. Thus, there's more data comparing
   older Windows editions to older GNU/Linux editions. The key is that these
   comparisons are fair, because they compare contemporaneous products. The
   available evidence suggests that FLOSS has a significant edge in reliability, at
   least in many circumstances.

                                      4. Performance

   Comparing GNU/Linux and Microsoft Windows performance on equivalent hardware has
   a history of contentious claims and different results based on different
   assumptions. FLOSS has at least shown that it's often competitive, and in many
   circumstances it beats the competition.

   Performance benchmarks are very sensitive to the assumptions and environment, so
   the best benchmark is one you set up yourself to model your intended environment.
   Failing that, you should use unbiased measures, because it's so easy to create
   biased measures.

   First, here are a few recent studies suggesting that some FLOSS systems beat
   proprietary competitors in at least some circumstances:
    1. Linux has done well in TPC database measures. In 2002, TPC-C database
       measures found that a Linux based system was faster than a Windows 2000 based
       system. In 2008, RHEL 5 did extremely well. More specifically, in 2002, an HP
       ProLiant DL580 with 32 Intel Xeon 900MHz CPUs running Oracle 9i R2 Enterprise
       edition ran faster running on a stock Red Hat Linux Advanced Server than on
       Microsoft Windows 2000 Advanced Server. You can see the [266]Linux and
       [267]Windows reports; note that [268]HP did not modify the Linux kernel to
       get these results.
        In 2008, [269]an independent test found that Red Hat Enterprise Linux (RHEL)
        5 Advanced Platform did better than all other operating systems that could
        process more than 1 million transactions per minute, and at 22% lower cost
        than its next closest competitor. The Transaction Processing Performance
       Council (TPC) validated Red Hat's processing of 1.2 million transactions per
       minute on an IBM System x 3950M2 with the Intel X7460 Xeon processor. Total
       IBM/Red Hat hardware and software costs were $1.99 per transaction, 22%
        cheaper than the next least expensive hardware/software combination (IBM/AIX
       at $2.81 per transaction). No Windows-based system performed as well, and
       Unix systems with faster performance cost at least 50% more.
    2. PC Magazine's November 2001 performance tests for file servers found that
       Linux with Samba significantly outperformed Windows 2000. Their article
       [270]Performance Tests: File Server Throughput and Response Times found that
       Linux with Samba significantly outperformed Windows 2000 Server when used as
       a file server for Microsoft's own network file protocols. This was true
       regardless of the number of simultaneous clients (they tested a range up to
        30 clients), and it was true on the whole range of computers they used
        (Pentium II/233MHz with 128MiB RAM, Pentium III/550MHz with 256MiB RAM, and
        Pentium III/1GHz with 512MiB RAM, where [271]MiB is 2^20 bytes). Indeed, as
        the machines became more capable the absolute difference grew more
        pronounced. On the fastest hardware while handling the largest number of
        clients, GNU/Linux's throughput was about 130 MB/sec vs. Windows' 78 MB/sec
        (making GNU/Linux roughly 67% faster).
    3. PC Magazine tested file server performance again in April 2002; Linux with
       Samba beat Windows 2000 again, but Samba then surpassed Windows 2000 by about
        100% and could handle 4 times as many clients. PC Magazine published another
       comparison of Samba and Windows (a summary is available electronically as
       [272]"Samba runs rings around Win2000".). They noted that the later Samba
       software surpasses the performance of Windows 2000 by about 100 percent under
       benchmark tests, and found that Linux and Samba can handle four times as many
       client systems as Windows 2000 before performance begins to drop off. Jay
       White, IT manager at electronics firm BF Group, said that Samba is one of the
       most useful pieces of server software available for a mixed Windows and Linux
       environment. "Our Samba server has been online for 394 days so far. The total
       cost is the hardware plus 30 minutes of my time each year," he said. Mark
       Twells, IT coordinator at a large education facility, said, "We run six Samba
        servers on a variety of hardware [and] we have around 1,000 users"; this is
        certainly excellent evidence of Samba's utility.
    4. In performance tests by Sys Admin magazine, GNU/Linux beat Solaris (on
       Intel), Windows 2000, and FreeBSD. The article [273]"Which OS is Fastest for
       High-Performance Network Applications?" in the July 2001 edition of [274]Sys
       Admin magazine examined high-performance architectures and found that
       GNU/Linux beat its competition when compared with Solaris (on Intel), FreeBSD
       (an FLOSS system), and Windows 2000. They intentionally ran the systems "out
       of the box" (untuned), except for increasing the number of simultaneous
       TCP/IP connections (which is necessary for testing multi-threaded and
       asynchronous applications). They used the latest versions of OSes and the
       exact same machine. They reported (by OS) the results of two different
       performance tests.
       The FreeBSD developers complained about these tests, noting that FreeBSD by
       default emphasizes reliability (not speed) and that they expected anyone with
       a significant performance need would do some tuning first. Thus, [275]Sys
        Admin re-did the FreeBSD tests after tuning it. One change they
       made was switching to "asynchronous" mounting, which makes a system faster
       (though it increases the risk of data loss in a power failure) - this is the
       GNU/Linux default and easy to change in FreeBSD, so this was a very small and
       reasonable modification. However, they also made many other changes, for
       example, they found and compiled in 17 FreeBSD kernel patches and used
       various tuning commands. The other OSes weren't given the chance to "tune"
       like this, so comparing untuned OSes to a tuned FreeBSD isn't really fair.
       In any case, here are their two performance tests:
         1. Their "real-world" test measured how quickly large quantities of email
            could be sent using their email delivery server (MailEngine). Up to 100
            simultaneous sends there was no difference, but as the number increased
            the systems began showing significant differences in their hourly email
            delivery speed. By 500 simultaneous sends GNU/Linux was clearly faster
            than all except FreeBSD-tuned, and GNU/Linux remained at the top.
            FreeBSD-tuned had similar performance to GNU/Linux when running 1000 or
             fewer simultaneous sends, but FreeBSD-tuned peaked around 1000-1500
            simultaneous connections with a steady decline not suffered by
            GNU/Linux, and FreeBSD-tuned had trouble going beyond 3000 simultaneous
            connections. By 1500 simultaneous sends, GNU/Linux was sending 1.3
            million emails/hour, while Solaris managed approximately 1 million, and
            Windows 2000 and FreeBSD-untuned were around 0.9 million.
         2. Their "disk I/O test" created, wrote, and read back 10,000
            identically-sized files in one directory, varying the size of the file
            instances. Here Solaris was the slowest, with FreeBSD-untuned the
            second-slowest. FreeBSD-tuned, Windows 2000, and GNU/Linux had similar
            speeds at the smaller file sizes (in some cases FreeBSD-tuned was
            faster, e.g., 8k and 16k file size), but when the file sizes got to 64k
            to 128k the OSes began to show significant performance differences;
            GNU/Linux was the fastest, then Windows 2000, then FreeBSD. At 128k,
            FreeBSD was 16% worse than Windows 2000, and 39% worse than GNU/Linux;
            all were faster than FreeBSD-untuned and Solaris. When totaling these
            times across file sizes, the results were GNU/Linux: 542 seconds,
            Windows 2000: 613 seconds, FreeBSD-tuned: 630 seconds, FreeBSD-untuned:
            2398 seconds, and Solaris: 3990 seconds.
    5. GNU/Linux with TUX has produced better SPEC values than Windows/IIS in
       several cases, even when given inferior drive configurations. One
       organization that tries to develop unbiased benchmarks is the [276]SPEC
       Consortium, which develops and maintains a whole series of benchmarks. We can
       compare Microsoft Windows versus GNU/Linux by comparing SPECweb99 results
       (which measure web server performance) on identical hardware if both have
       undergone the same amount of performance optimization effort. Alas, things
       are not so simple; rarely are the same basic hardware platforms tested with
       both OSes, and even when that occurs, as of July 13, 2001 no exactly
       identical configurations have been tested (they differ in ways such as using
       a different number of hard drives, or including some faster hard drives).
       Using all results available by July 13, 2001, there were three hardware
       configurations, all from Dell, which ran both GNU/Linux (using the TUX web
       server/accelerator) and Windows (using IIS) on exactly the same underlying
       hardware. Here are the SPECweb99 results as of July 13, 2001 (larger is
       better), noting configuration differences:

    System: Dell PowerEdge 4400/800, 2 800MHz Pentium III Xeon
        Windows SPEC Result: 1060 (IIS 5.0, 1 network controller)
        Linux SPEC Result:   2200 (TUX 1.0, 2 network controllers)
    System: Dell PowerEdge 6400/700, 4 700MHz Pentium III Xeon
        Windows SPEC Result: 1598 (IIS 5.0, 7 9GB 10KRPM drives)
        Linux SPEC Result:   4200 (TUX 1.0, 5 9GB 10KRPM drives)
    System: Dell PowerEdge 8450/700, 8 700MHz Pentium III Xeon
        Windows SPEC Result: 7300/NC (IIS 5.0, 1 9GB 10KRPM and 8 16GB 15KRPM
                             drives), then 8001 (IIS 5.0, 7 9GB 10KRPM and 1 18GB
                             15KRPM drive)
        Linux SPEC Result:   7500 (TUX 2.0, 5 9GB 10KRPM drives)

       The first row (the PowerEdge 4400/800) doesn't really prove anything. The IIS
       system has lower performance, but it only had one network controller and the
       TUX system has two - so while the TUX system had better performance, that
       could simply be because it had two network connections it could use.
       The second entry (the PowerEdge 6400/700) certainly suggests that GNU/Linux
       plus TUX really is much better - the IIS system had two more disk drives
       available to it (which should increase performance), but the TUX system had
       over twice the IIS system's performance.
       The last entry for the PowerEdge 8450/700 is even more complex. First, the
       drives are different - the IIS systems had at least one drive that revolved
       more quickly than the TUX systems (which should give IIS higher performance
       overall, since the transfer speed is almost certainly higher). Also, there
       were more disk drives (which again should give IIS still higher performance).
       When I originally put this table together showing all data publicly available
       in April 2001 (covering the third quarter of 1999 through the first quarter
       of 2001), IIS 5.0 (on an 8-processor Dell PowerEdge 8450/700) had a SPECweb99
       value of 7300. Since that time, Microsoft changed the availability of
       Microsoft SWC 3.0, and by SPECweb99 rules, this means that those test results
       are "not compliant" (NC). This is subtle; it's not that the test itself was
       invalid, it's that Microsoft changed what was available and used the SPEC
       Consortium's own rules to invalidate a test (possibly because the test
       results were undesirable to Microsoft). A retest then occurred, with yet
       another disk drive configuration, at which point IIS produced a value of
       8001. However, both of these figures are on clearly better hardware - and in
       one circumstance the better hardware didn't do better.
       Thus, in these configurations the GNU/Linux plus TUX system was given
       inferior hardware yet still sometimes won on performance. Since other factors
       may be involved, it's hard to judge - there are pathological situations where
       "better hardware" can have worse performance, or there may be another factor
       not reported that had a more significant effect. Hopefully in the future
       there will be many head-to-head tests in a variety of identical
       configurations.
       Note that TUX is intended to be used as a "web accelerator" for many
       circumstances, where it rapidly handles simple requests and then passes more
       complex queries to another server (usually Apache). I've quoted the TUX
       figures because they're the recent performance figures I have available. As
       of this time I have no SPECweb99 figures or other recent performance measures
       for Apache on GNU/Linux, or for Apache and TUX together; I also don't have
       TUX reliability figures. I expect that such measures will appear in the
       future.
    6. Low-level benchmarks by IBM found that GNU/Linux had better performance than
       Windows for pipes (an input/output mechanism), and also process and thread
       creation. Ed Bradford (manager of Microsoft Premier Support for IBM Software
       group) published in October 2001 the study [277]Pipes in Linux, Windows 2000,
        and Windows XP. In this study he examined the performance of pipes, a common
        low-level mechanism for communicating between program processes (a small
        sketch of this kind of measurement appears after this list). He
       found the pipes in Red Hat 7.1 (with Linux kernel version 2.4.2) had a peak
       I/O rate of around 700 MB/sec, with a steady state at near 100 MB/sec for
       very large block sizes. In contrast, Windows 2000 peaked at 500 MB/sec, with
       a large block steady state of 80 MB/sec. Windows XP Professional (evaluation
       version) was especially disappointing; its peak I/O rate was only 120 MB/sec,
        with a steady state of 80 MB/sec, all on the same platform and all running a
       GUI.
       In February 2002 he published [278]Managing processes and threads, in which
       he compared the performance of Red Hat Linux 7.2, Windows 2000 Advanced
       Server ("Win2K"), and Windows XP Professional ("WinXP"), all on a Thinkpad
       600X with 320MiB of memory. Linux managed to create over 10,000
       threads/second, while Win2K didn't quite manage 5,000 threads/second and
       WinXP only created 6,000 threads/second. In process creation, Linux managed
       330 processes/second, while Win2K managed less than 200 processes/second and
       WinXP less than 160 processes/second.
    7. eWeek found in its tests that the FLOSS program MySQL was quite comparable to
       the proprietary Oracle database program, and the pair outperformed other
       proprietary programs. [279]eWeek Labs/PC Labs compared several database
       packages and released the results on February 25, 2002. Comparable
       performance measures of database programs are actually quite rare. As they
       note, "database vendors routinely use no-benchmarking clauses in their
       license agreements to block publication of benchmarks of which they do not
       approve". Indeed, to their knowledge, this is the first time a computer
       publication has published database benchmark results tested on the same
       hardware since PC Magazine did so in October 1993 (almost 9 years earlier).
       However, they took the risk and published the results examining five server
       databases: IBM's DB2 7.2 with FixPack 5, Microsoft Corp.'s SQL Server 2000
       Enterprise Edition with Service Pack 2, MySQL AB's MySQL 4.0.1 Max, Oracle
       Corp.'s Oracle9i Enterprise Edition 9.0.1.1.1, and Sybase Inc.'s ASE
       (Adaptive Server Enterprise) 12.5.0.1. Their goal was to create a level
       playing field to determine which database performed best when used with a
       Java-based application server.
       The results? They found that overall Oracle9i and MySQL had the best
       performance and scalability; Oracle9i was slightly ahead of MySQL in most
       cases, but Oracle costs far more. "ASE, DB2, Oracle9i and MySQL finished in a
       dead heat up to about 550 Web users. At this point, ASE's performance leveled
       off at 500 pages per second, about 100 pages per second less than Oracle9i's
       and MySQL's leveling-off point of about 600 pages per second. DB2's
       performance dropped substantially, leveling off at 200 pages per second under
       high loads. Due to its significant JDBC (Java Database Connectivity) driver
       problems, Microsoft's SQL Server was limited to about 200 pages per second
       for the entire test".
       Naturally, "Manual tuning makes a huge difference with databases - in
       general, our final measured throughput was twice as fast as our initial
       out-of-the-box test runs". In this case, they found that "SQL Server and
       MySQL were the easiest to tune, and Oracle9i was the most difficult because
       it has so many separate memory caches that can be adjusted".
       MySQL also demonstrated some significant innovation. Its performance was due
       primarily to its "query cache", a capability not included in any other
       database. If the text of a query has a byte-for-byte match with a cached
       query, MySQL can retrieve the results directly from its cache without
        compiling the query, getting locks, or doing index accesses (a simplified
        illustration of this idea appears after this list). Obviously, this
       technique is only effective for tables with few updates, but it certainly
       made an impact on this benchmark and is a helpful optimization for many
       situations. MySQL also supports different database engines on a
       table-by-table basis; no other tested database had this feature.
       They also found that of the five databases they tested, only Oracle9i and
       MySQL were able to run their test application as originally written for 8
       hours without problems. They had to work around various problems for all the
       others.
       In this case, an FLOSS program beat most of its proprietary competition in
       both performance and reliability (in terms of being able to run a
       correctly-written application without problems). A proprietary program
       (Oracle) beat it, but barely, and its competitor is far more expensive. It
       certainly is arguable that MySQL is (for this application) a comparable
       application worthy of consideration.
       [280]MySQL AB also reports other benchmark results comparing MySQL with other
       products; however, since they are not an independent lab, I'm not
       highlighting their results here.
    8. In February 2003, scientists broke the Internet2 Land Speed Record using
       GNU/Linux. [281]Scientists sent 6.7 GB of uncompressed data at 923 megabits
       per second in just 58 seconds from Sunnyvale, California, to Amsterdam - the
       equivalent of four hours of DVD-quality movies, using a transfer speed 3,500
       times faster than a typical household broadband connection. The team used PCs
       running Debian GNU/Linux in Amsterdam and Red Hat Linux in Sunnyvale,
       California.
    9. Benchmarks comparing Sun Solaris x86 and GNU/Linux found many similarities,
       but GNU/Linux had double the performance in web operations. Tony Bourke's
       October 2003 evaluation [282]Sun Versus Linux: The x86 Smack-down gave a
       general review comparing Sun Solaris x86 and Red Hat Linux. He found that
       "Performance was overall similar for most of the metrics tested, perhaps with
       Linux in a very slight lead. However, with the web operations test (arguably
       the most important and relevant), Linux is a clear winner". He found that,
       given the same web serving programs and configuration, GNU/Linux supported
       over 2000 fetches/second while Solaris x86 supported less than 1000
       fetches/second.
   10. Anandtech's August 2005 comparison of Mac OS X and GNU/Linux found that the
       Linux-based system ran five to eight times faster on server tasks
       (specifically using MySQL). [283]Anandtech ran Linux on a slightly slower
       system, and Mac OS X on a slightly faster system. With effort they showed
       that the poor performance they'd seen earlier in Mac OS X was not due to the
       hardware, but to the operating system itself, because changing operating
       systems on essentially the same hardware produced radically different
        performance results. In particular, they found Linux created processes and
       threads, raised signals, and performed other interprocess communication far
       more rapidly than Mac OS X. Note that many Linux systems are exclusively
       FLOSS, while Apple's Mac OS X is a mix of proprietary and FLOSS (the result
       is very much proprietary).
   11. Microsoft themselves found that two FLOSS operating systems, Linux and
       FreeBSD, had better performance than Windows by many measures. [284]Paul
       Murphy's "'Unix beats Windows' - says Microsoft!" article of November 8,
       2005, pointed out a Microsoft Research report about their research on their
       "Singularity" research prototype. The report compares their research
       prototype to Windows, Linux, and FreeBSD... exposing performance figures that
       compare these operating systems directly to each other. Murphy writes,
       "What's noteworthy about it is that Microsoft compared Singularity to FreeBSD
       and Linux as well as Windows/XP - and almost every result shows Windows
       losing to the two Unix variants". And where they didn't do as well, Murphy
       determines that it was because "there are better, faster, ways of doing these
       things in Unix, but these guys... either didn't know or didn't care". These
       numbers certainly don't prove that any one system is always the best
       performer, but it certainly justifies considering them.
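
    As a rough illustration of the kind of measurement behind the pipe results in
    item 6, here is a minimal sketch, in Python, of a pipe-throughput test. It is
    Unix-only (it uses fork), and the block size, total volume, and function name
    are arbitrary choices of mine; it is not Ed Bradford's benchmark, whose
    methodology is described in the cited article.

        import os
        import time

        def pipe_throughput(block_size=64 * 1024, total_bytes=256 * 1024 * 1024):
            """Return approximate pipe throughput in MB/sec (Unix-only)."""
            read_fd, write_fd = os.pipe()
            pid = os.fork()
            if pid == 0:                         # child: drain the pipe
                os.close(write_fd)
                while os.read(read_fd, block_size):
                    pass
                os._exit(0)
            os.close(read_fd)                    # parent: write and time it
            block = b"\0" * block_size
            start = time.perf_counter()
            written = 0
            while written < total_bytes:
                written += os.write(write_fd, block)
            os.close(write_fd)                   # EOF lets the child finish
            os.waitpid(pid, 0)
            elapsed = time.perf_counter() - start
            return written / elapsed / 1e6

        if __name__ == "__main__":
            print("approximate pipe throughput: %.0f MB/sec" % pipe_throughput())

    Varying block_size is what produces the peak-versus-steady-state curves
    described in that study.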
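
    The "query cache" idea from item 7 can also be illustrated with a short sketch.
    This is a deliberately simplified model of the general technique, not MySQL's
    actual implementation: results are keyed on the exact query text, and a write
    to a table discards every cached query that read it. The class and method
    names are mine.

        class QueryCache:
            """Toy byte-for-byte query cache (illustration only)."""

            def __init__(self, run_query):
                self.run_query = run_query     # function that really executes SQL
                self.cache = {}                # exact query text -> (tables, rows)

            def select(self, sql, tables):
                """Return rows for sql; tables lists the tables it reads."""
                if sql in self.cache:          # exact match: skip parsing, locks,
                    return self.cache[sql][1]  # and index accesses entirely
                rows = self.run_query(sql)
                self.cache[sql] = (frozenset(tables), rows)
                return rows

            def table_modified(self, table):
                """Drop every cached result that read the modified table."""
                self.cache = {sql: entry for sql, entry in self.cache.items()
                              if table not in entry[0]}

    As the eWeek article notes, such a cache only pays off for tables that change
    rarely; frequent writes keep emptying it.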

   All OSes in active development are in a constant battle for performance
   improvements over their rivals. The history of comparing Windows and GNU/Linux
   helps put this in perspective:
    1. Ziff-Davis found that GNU/Linux with Apache beat Windows NT 4.0 with IIS by
       16%-50% depending on the GNU/Linux distribution. [285]Ziff-Davis compared
       Linux and Windows NT's performance at web serving. They found that "Linux
       with Apache beats NT 4.0 with IIS, hands down. SuSE, the least effective
       Linux, is 16% faster than IIS, and Caldera, the leader, is 50% faster".
    2. [286]Mindcraft released a report in April 1999 that claimed that Microsoft
       Windows NT Server 4.0 is 2.5 times faster than Linux (kernel 2.2) as a File
       Server and 3.7 times faster as a Web Server when running on a 4-CPU SMP
        system. Several people and organizations, such as [287]Linux Weekly News (LWN)
       and [288]Dan Kegel, identified serious problems with this study. An obvious
       issue was that NT was specially tuned by Microsoft's NT experts, at
       Microsoft, while GNU/Linux was not tuned at all. Another issue is that the
       price/performance wasn't considered (nor was total expenditure kept constant
       - for the same amount of money, the GNU/Linux system could have had better
       hardware). Mindcraft claimed they asked for help, but they didn't use the
       documented methods for getting help nor did they purchase a support contract.
       Many were especially offended that even though this study was funded by
       Microsoft (one of the contestants) and held at their facility, neither
       Mindcraft's initial announcement nor its paper made any mention of this
       conflict-of-interest - and it could be easily claimed that their
       configuration was designed to put GNU/Linux at a disadvantage. Their
       configuration was somewhat bizarre - it assumed all web pages were static
       (typical big sites tend to use many dynamically generated pages) and that
       there were 100 or so clients connected via 100baseT (in 1999 a more typical
       situation would be that most clients are using slower 28.8 or 56 Kbps
       modems).
       Careful examination of the benchmark did find some legitimate Linux kernel
       problems, however. These included a TCP bug, the lack of "wake one"
       semantics, and SMP bottlenecks (see [289]Dan Kegel's pages for more
       information). The Linux kernel developers began working on the weaknesses
       identified by the benchmark.
    3. PC Week confirmed that Windows did indeed do better in this less probable
        configuration. On June 30, 1999, Mindcraft released their [290]Open Benchmark
       in conjunction with PC Week. While this didn't excuse Mindcraft's biases, it
       did make a convincing case that there were legitimate problems in the Linux
       kernel and Apache that made GNU/Linux a poorer-performing product in this
       somewhat improbable configuration (serving static web pages to clients with
       high-speed connections). Note that this configuration was considerably
       different than Ziff-Davis's, so the benchmarks don't necessarily conflict;
       it's merely that different assumptions can produce different results (as I've
       already stressed).
    4. The German magazine c't found that web sites with NT were better at static
       content and dual network connections, but GNU/Linux sites were better for
       sites with dynamic content and single connections. Their article [291]Mixed
       Double: Linux and NT as Web Server on the Test Bed examined Windows NT with
       IIS against GNU/Linux (kernel 2.2.9) with Apache on a machine with four
       Pentium II Xeon CPUs. They found that the performance winner depended on the
       situation (by now that should not be a surprise). If the web server primarily
       served static web pages through two high-performance network cards, NT's
       performance was better. However, they also noted that in sophisticated web
       sites this result didn't apply, because such sites tend to have primarily
       dynamic content, and that few sites had this kind of dual-network connection
       (when only one network board was available, GNU/Linux generally had an edge).
       They concluded that "Mindcraft's result can't be transferred to situations
       with mainly dynamic contents - the common case in nearly every sophisticated
       web site... In the web server areas most relevant for practical use, Linux
       and Apache are already ahead by at least one nose. If the pages don't come
       directly from the system's main memory, the situation is even reverted to
       favor Linux and Apache: Here, the [FLOSS] movement's prime products leave
       their commercial competitors from Redmond way behind". See their paper for
       more figures and background.
    5. Network Computing found that GNU/Linux with Samba ran at essentially the same
       speed as Windows for file serving. In their article [292]"Is it Time for
       Linux", Network Computing compared Red Hat Linux v5.2 running Samba 2.0.3
       against Microsoft Windows NT Server Enterprise Edition on a Pentium II-based
       HP NetServer LPr, stressing the machine with multiple reads and writes of
       small, medium and large files over the course of several hours.
       For file serving, they discovered only "negligible performance differences
       between the two for average workloads... [and] depending on the degree of
       tuning performed on each installation, either system could be made to surpass
       the other slightly in terms of file-sharing performance". Red Hat Linux
       slightly outperformed NT on file writes, while NT edged out Red Hat Linux on
       massive reads. Note that their configuration was primarily network-limited;
       they stated "At no point were we able to push the CPUs much over 50-percent
        utilization - the single NIC, full duplex 100BASE-T environment wouldn't allow
       it".
       They also noted that "examining the cost difference between the two licenses
       brings this testing into an entirely new light... the potential savings on
       licenses alone is eye-opening. For example, based on the average street price
       of $30 for a Windows NT client license, 100 licenses would cost around
       $3,000, plus the cost of an NT server license (around $600). Compare this to
       the price of a Red Hat Linux CD, or perhaps even a free download, and the
       savings starts to approach the cost of a low-end workgroup server. Scale that
       up to a few thousand clients and you begin to see the savings skyrocket". See
       this paper's section on [293]total cost of ownership.
    6. The Linux developers' various efforts to improve performance appear to have
       paid off. In June 2000, Dell measured the various SPECweb99 values noted
       above.

   There are other benchmarks available, but I've discounted them on various
   grounds:
     1. A more recent set of articles from eWeek in June 2001 shows some eye-popping
       performance numbers for GNU/Linux with TUX. However, although they compare it
       to Microsoft IIS, they don't include Microsoft's SWC (Scalable Web Cache),
       Microsoft's response to TUX - and omitting it makes this comparison less
       balanced. You can read more at [294]"Tux: Built for Speed", [295]"Smart
       Coding pays off Big", and [296]Kegel's detailed remarks.
    2. The ZDNet article [297]Take that! Linux beats MS in benchmark test, loudly
       trumpeted that GNU/Linux was the May 2001 performance leader in the TPC-H
       decision support (database) benchmark ("100Gb" category). However, this
       result should not be taken very seriously; the hardware that Linux ran on was
        more powerful than that of the runner-up (Windows 2000). Frankly, more
        surprising than its top score (which can be easily explained by the hardware)
        is that it was measured with this benchmark at all - traditionally only
        Microsoft's numbers are reported for this benchmark at this range. For
       more information, see [298]the TPC results.

   More information on various benchmarks is available from Kegel's [299]NT vs.
   Linux Server Benchmark Comparisons, [300]SPEC, and the [301]dmoz entry on
   benchmarking.

   Remember, in benchmarking, everything depends on the configuration and
   assumptions that you make. Many systems are constrained by network bandwidth; in
   such circumstances buying a faster computer won't help at all. Even when network
   bandwidth isn't the limitation, much depends on what the products are designed to
    do. Neither Windows nor GNU/Linux does well in large-scale symmetric
    multiprocessing (SMP) configurations, e.g., 64-way CPU systems with shared
    memory. On the other hand, if you want massive distributed non-shared
   memory, GNU/Linux does quite well, since you can buy more CPUs with a given
   amount of money. If massive distribution can't help you and you need very high
   performance, Windows isn't even in the race; today Windows runs essentially only
   on Intel x86 compatible chips, while GNU/Linux runs on much higher performance
   processors as well as the x86.

                                      5. Scalability

   Which brings us to the topic of scalability, a simple term with multiple meanings
    all having to do with either growing to large size, or being able to cover both
   small and large sizes. The large problems might include those needing hardware
   platforms with extremely high performance, massive storage, or a massive amount
   of software to do the job. The small sizes might include personal digital
    assistants (PDAs). However, there is much evidence suggesting that FLOSS can
   scale:
     1. GNU/Linux dominates in supercomputing: GNU/Linux is used in 78% of the
        world's 500 fastest supercomputers and in most of the world's ten fastest
        supercomputers... including the world's most powerful supercomputer
        (as of March and November 2005). By March 2005 [302]Forbes noted that 60% of
       the world's fastest supercomputers use GNU/Linux, using data from [303]Top500
       to determine which computers are the world's fastest. Of those top 500, the
        best available information shows that 301 run GNU/Linux, 189 run Unix, 2 run
        FreeBSD (another FLOSS Unix variant), and one runs Microsoft's Windows. A few
       machines' operating systems are unknown, but even so, Forbes says "Linux
       clearly is by far the top choice for high-performance computing".
       [304]Joe Greenseid reported on LWN that this dominance is even more obvious
        in the top ten supercomputers as of March 2005; GNU/Linux systems account for
       8 out of the top 10. Six of these ten were made by IBM, including five Blue
       Gene systems and one PPC Cluster. Third place is held by an SGI Altix running
       GNU/Linux. Thunder, an Intel Itanium2 Tiger4 "white box" system, holds
       seventh place and runs GNU/Linux.
       More recent data from November 2005 shows this as an increasing trend. Jay
       Lyman's November 15, 2005 article [305]Linux continues supercomputer
       domination notes that on the November 2005 Top500 list, 78% of the world's
       fastest machines (391/500) rely on Linux, far more than anything else. Seven
       of the top 10 systems are running GNU/Linux (the other three run AIX, UNICOS,
       and Super-UX), and as with the March 2005 survey, the fastest supercomputer
       in the world runs on GNU/Linux. In contrast, "Microsoft Windows didn't even
       turn up on the list". Erich Strohmaier, co-founder and editor of the Top500
        list, said of this FLOSS system that "Linux is the dominating OS in the supercomputing
       community and will keep this role... If anything, it will only enlarge its
       prevalence". In fact, he believes that "no other operating system is likely
       to be used as much as [GNU/]Linux in the foreseeable Top500 future".
       Strohmaier said there were at least two reasons for this: GNU/Linux was more
       cost-effective, and that it matches what many organizations already run on
       their servers. Instead, GNU/Linux "has become an industry standard in this
       community, and any other OS trying to break into this market (Mac OS X,
       Windows, etc.) would have to fight a steep, uphill battle".
       This increasing use of FLOSS operating systems in supercomputers has been a
       long-running trend. For years, GNU/Linux has been used in the most powerful
       computers in the world. GNU/Linux can be used to support massive parallel
       processing; a common approach for doing this is the [306]Beowulf
       architecture. In June 2001, the 42nd most powerful computer (according to the
       [307]TOP 500 Supercomputer list, June 2001) was [308]Sandia's Linux-based
       "CPlant". By May 2004, the [309]Lawrence Livermore National Laboratory's
       Linux-based "Thunder" delivered 19.94 teraflops, making it the second fastest
       on earth and the most powerful computer in North America. By November 2004,
       [310]IBM's Linux-based Blue Gene/L supercomputer became the most powerful
       supercomputer in the world, with 91.75 teraflops of peak floating point
       performance (as measured by the Linpack Fortran benchmark test) and 70.72
       teraflops of sustained performance. This system is based on Linux, and is
       only a quarter of its eventual planned size. Indeed, IBM plans for the Blue
       Gene family to eventually perform a quadrillion calculations per second (one
       petaflop). As of March 2005 Blue Gene/L was still the fastest supercomputer
        in the world, and it was running GNU/Linux.
        The Internet Archive -- the world's largest library in terms of the amount
        of text it retains -- uses an FLOSS operating system. The [311]Internet
        Archive crawls and archives the
       entire World Wide Web, including old versions of documents, subject to
       certain restrictions. They note that as of 2005 they archive approximately 1
       petabyte of data (one million gigabytes), growing at a rate of 20 terabytes
       per month. As they note, "This eclipses the amount of text contained in the
       world's largest libraries, including the Library of Congress". They do this
       on x86 machines using the GNU/Linux operating system. As of June 2005 they
       are changing their machine architecture, by adding an additional 1.5
        Petabytes of space ([312]see here for details), but they are still running
        GNU/Linux.
        GNU/Linux and NetBSD (both FLOSS) support a wider range of hardware
        platforms and performance than any other OS. By "scalability" many people
        mean the answer to the question, "can you use the same software system
        for both small and large projects?" Often the implied issue is that you'd
       like to start with a modest system, but have the ability to grow the system
       as needs demand without costly modifications. Here FLOSS is unbeatable;
       because many people can identify scalability problems, and because its source
       code can be optimized for its platform, the scalability of many FLOSS
       products is amazing. Let's specifically look at GNU/Linux. GNU/Linux works on
       [313]PDAs (including the [314]Agenda VR3), [315]obsolete hardware (so you
       needn't throw the hardware away), common modern PC hardware, over a dozen
       different chipsets (not just Intel x86s), [316]mainframes, [317]massive
       clusters, and a [318]number of supercomputers. There's even a prototype
        implementation of GNU/Linux on a [319]wrist watch. And GNU/Linux runs on a
        vast number of different CPU chips, including the [320]x86, Intel Itanium,
        ARM, Alpha, IBM AS/400 (midrange), SPARC, MIPS, 68k, and Power PC; indeed,
        [321]the Linux kernel supports more different processors than any other
        operating system kernel ever has. Another FLOSS operating system that widely
       scales to many other hardware platforms is [322]NetBSD.
       Thus, you can buy a small GNU/Linux or NetBSD system and grow it as your
       needs grow; indeed, you can replace small hardware with massively parallel or
       extremely high-speed processors or very different CPU architectures without
       switching OSes. Windows CE scales down to smaller platforms, but Windows
       simply does not scale up to the largest computing systems. Windows used to
       run on other platforms (such as the Alpha chips), but in practical terms,
       Windows is used and supported almost exclusively on x86 systems. Many Unix
       systems (such as Solaris) scale well to specific large platforms, but not as
       well to distributed or small platforms. In short, the most scalable and
       portable systems available are FLOSS.
    2. FLOSS development processes can scale to develop large software systems. At
       one time it was common to ask if the FLOSS process is "scalable," that is, if
       FLOSS processes could really develop large-scale systems. Bill Gates' 1976
       "Open Letter to Hobbyists" asked rhetorically, "Who can afford to do
       professional work for nothing? What hobbyist can put three man-years into
       programming, finding all bugs, documenting his product, and distribute it for
       free?" He presumed these were unanswerable questions - but he was wrong. See
       [323]my reports estimating GNU/Linux's size. For Red Hat Linux 6.2, I found
       the size to be over 17 million source lines of code (SLOC). Implemented
       traditionally it would have taken 4,500 person-years and over $600 million to
        implement this distribution. For Red Hat Linux 7.1, I found it to have over
        30 million SLOC, representing 8,000 person-years or $1 billion (a
        "Gigabuck"); a rough sketch of this kind of traditional effort estimate
        appears after this list. Most developers subscribe to the design principle
        that components should be divided into smaller components where practical -
        a practice also applied to GNU/Linux - but some components aren't easily
        divided, and thus
       some components are quite large themselves (e.g., over 2 million lines of
       code for the kernel, mostly in device drivers). By October 2002,
       [324]Sourceforge.net announced that it had surpassed 500,000 registered users
       and supported almost 50,000 FLOSS projects - and a vast number of FLOSS
       projects don't use SourceForge. Thus, it's no longer reasonable to argue that
       FLOSS cannot scale to develop large systems -- because it clearly can.
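
    For readers curious how a SLOC count turns into person-year and dollar figures
    like those above, here is a rough sketch, in Python, using the classic Basic
    COCOMO "organic" model. The salary and overhead defaults are illustrative
    placeholders of mine, and the cited reports use more detailed assumptions, so
    this sketch will not reproduce their exact totals; it only shows the shape of
    the calculation.

        def basic_cocomo_organic(sloc, annual_salary=56286, overhead=2.4):
            """Rough traditional-development estimate from a SLOC count.

            Effort = 2.4 * (KSLOC ** 1.05) person-months (Basic COCOMO,
            organic mode).  The salary and overhead defaults are illustrative
            placeholders, not the values used in the cited reports.
            """
            ksloc = sloc / 1000.0
            person_months = 2.4 * ksloc ** 1.05
            person_years = person_months / 12.0
            cost = person_years * annual_salary * overhead
            return person_years, cost

        # Roughly 17 million SLOC, as reported for Red Hat Linux 6.2.
        years, dollars = basic_cocomo_organic(17 * 1000 * 1000)
        print("about %.0f person-years and $%.0f million" % (years, dollars / 1e6))

    Applying the model to one giant aggregate, as this sketch does, gives a larger
    total than estimating each component separately, which is one reason the
    numbers differ from the figures quoted above.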

                                       6. Security

   Quantitatively measuring security is very difficult. However, here are a number
   of attempts to do so, and they suggest that FLOSS is often superior to
   proprietary systems, at least in some cases. I'll concentrate on comparing FLOSS
   to Windows systems, since as noted above other proprietary systems are
   increasingly including FLOSS components (making comparisons more difficult).

   At one time the security of FLOSS systems was widely debated. Clearly FLOSS
   systems are not magically invincible from security flaws. But for most of those
   who study the question, the issue of whether or not FLOSS improves or reduces
   security appears to be an increasingly settled issue. The prestigious
   [325]Communications of the ACM published "Increased Security through Open Source"
    by Hoepman and Jacobs in January 2007, which stated that "We believe open source
   software is a necessary requirement to build systems that are more secure....
   opening the source of existing systems will at first increase their exposure...
   However, this exposure (and the associated risk of using the system) can now be
   determined publicly. With closed source systems the perceived exposure may appear
   to be low, while the actual exposure... may be much higher. Moreover, because the
   source is open... the period of increased exposure is short. In the long run,
   openness of the source will increase its security... [and] it allows users to
   make a more informed choice about the security of a system...".

   Fundamentally FLOSS better meets the principle of "open design", a basic
   principle for developing secure systems identified by Saltzer and Schroeder long
   before FLOSS became popular. This principle itself isn't even new; Supreme Court
   Justice Louis Brandeis noted even earlier that "Publicity is justly commended as
   a remedy for social and industrial diseases. Sunlight is said to be the best of
   disinfectants; electric light the most efficient policeman".

   [326]The European Parliament approved A5-0264/2001 which calls "on the Commission
   and Member States to promote software projects whose source text is made public
   (open-source software), as this is the only way of guaranteeing that no backdoors
   are built into programmes [and calls] on the Commission to lay down a standard
   for the level of security of e-mail software packages, placing those packages
   whose source code has not been made public in the `least reliable' category" (5
   September, 2001; 367 votes for, 159 against and 39 abstentions).

   Here are some quantitative studies that back this up:

    1. [327]"Is Open Source Security a Myth?" by Guido Schryen (Communciations of
       the ACM, May 2011) gives excellent evidence that OSS should be considered.
       shows that, by their measure, OSS and proprietary software were roughly equal
       in security. Which means OSS security is not a myth. In this report, the
       author examined the NIST National Vulnerability Database, selected 17
       packages, and found that by this measure, "open source and closed source
       software do not significantly differ in terms of the severity of
       vulnerabilities, the type of development of vulnerability disclosure over
       time, and vendors' patching behavior. Although open source software
       development seems to prevent "extremely bad" patching behavior, overall there
       is no empirical evidence that the particular type of software development is
       the primary driver of security. Rather, the policy of the particular vendor
       determines the patching behavior". Indeed, it's not so clear that OSS is only
       "just as good" even by looking at the paper. The paper reports that "17.6%
       (30.4%) of the published open (closed) source software vulnerabilities (in
       terms of the median) are still unpatched". So on average, OSS had nearly half
       as many unpatched vulnerabilities... that does NOT sound like equality. I
       agree with the author, though, that whether or not something was unpatched
       depended far more on the supplier than on OSS vs. proprietary. Even more
       interestingly, the OSS vulnerabilities tended to be significantly less
       severe: "When we determine the medians of medians of open source software
       (5.7) and closed source software (6.8) and also the corresponding medians of
       the proportions of highly severe vulnerabilities (30.28% and 45.95%,
       respectively), the first impression is that open source software is more
       secure in terms of the severity level. However, applying statistical analysis
       (Mann-Whitney U-test) on the medians, no statistically significant
       differences can be found: the two-tailed test provides a high number for p
       (p=0.11). Applying the same test to the proportion figures, the test, again,
       does not indicate that the samples are significantly different at the 0.05
       level (p=0.06)". The statistical test with a p=0.06 technically doesn't meet
       the p=0.05 threshold (I think that is due to the small sample size), but I'd
       show up at ANY casino if I knew I was going to win 94% of the time.
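       For readers unfamiliar with the statistics involved: the Mann-Whitney U-test
       used in the study compares two samples without assuming any particular
       distribution. The minimal sketch below uses SciPy and made-up severity scores
       (they are not the study's data) simply to show what such a test looks like.

         # Minimal sketch of a Mann-Whitney U-test, the test applied in the Schryen
         # study. The severity scores are invented for illustration (CVSS-style
         # scores run from 0.0 to 10.0); they are NOT the study's data.
         from scipy.stats import mannwhitneyu

         oss_severity = [4.3, 5.7, 6.8, 5.0, 7.2, 5.4, 6.1]     # hypothetical
         closed_severity = [6.8, 5.9, 7.5, 6.2, 7.8, 5.1, 6.6]  # hypothetical

         stat, p = mannwhitneyu(oss_severity, closed_severity, alternative="two-sided")
         print(f"U = {stat}, p = {p:.3f}")
         # With these toy numbers, p stays above the conventional 0.05 threshold, so
         # the difference is not declared statistically significant even though one
         # sample looks consistently lower - the situation the study describes.
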
    2. J.S. Wurzler Underwriting Managers' "hacker insurance" costs 5-15% more if
       Windows is used instead of Unix or GNU/Linux for Internet operation. At least
       one insurance company has indicated that Windows NT is less secure than Unix
       or GNU/Linux systems, resulting in higher premiums for Windows-based systems.
       It's often difficult to find out when a company has been successfully
       cracked; companies often don't want to divulge such information to the public
       for a variety of reasons. Indeed, if consumers or business partners lost
       trust in a company, the resulting loss might be much greater than the
       original attack. However, insurance companies that insure against cracking
       can require that they get such information (as a condition of coverage), and
       can compute future premiums based on that knowledge. According to Cnet,
       Okemos, Mich.-based J.S. Wurzler Underwriting Managers, one of the earliest
       agencies to offer "hacker insurance" (and thus more likely to have historical
       data for premium calculation), has begun [328]charging its clients anywhere
       from 5 to 15 percent more if they use Microsoft's Windows NT software instead
       of Unix or GNU/Linux for their Internet operations. Walter Kopf, senior vice
       president of underwriting, said that "We have found out that the possibility
       for loss is greater using the NT system". He also said the decision is based
       on findings from hundreds of security assessments the company has done on
       their small and midsize business clients over the past couple of years.
    3. Most defaced web sites are hosted by Windows, and Windows sites are
       defaced disproportionately more often than its market share would explain.
       Another way to look at security is to look at the OS used by defaced web
       sites, and compare them to their market share. A "defaced" web site is a site
       that has been broken into and has its content changed (usually in a fairly
       obvious way, since subtle modifications are often not reported). The
       advantage of this measure is that unlike other kinds of security break-ins
       (which are often "hushed up"), it's often very difficult for victims to hide
       the fact that they've been successfully attacked. Historically, this
       information was maintained by Attrition.org. A summary can be found in
       [329]James Middleton's article, with the actual data found in
       [330]Attrition.org's web site. Attrition.org's data showed that 59% of
       defaced systems ran Windows, 21% Linux, 8% Solaris, 6% BSD, and 6% all others
       in the period of August 1999 through December 2000. Thus, Windows systems
       have had nearly 3 times as many defacements as GNU/Linux systems. This would
       make sense if there were 3 times as many Windows systems, but no matter which
       figures you use, that's simply not true.
       Of course, not all sites are broken through their web server and OS - many
       are broken through exposed passwords, bad web application programming, and so
       on. But if this is so, why is there such a big difference in the number of
       defacements based on the OS? No doubt some other reasons could be put forward
       (this data only shows a correlation not a cause), but this certainly suggests
       that FLOSS can have better security.
       [331]Attrition.org decided to abandon keeping track of this information
       because it could not keep up with the sheer volume of defaced sites. However,
       [332]defaced.alldas.de has decided to perform this valuable service. Their
       recent reports show that this trend has continued; on July 12, 2001, they
       report that 66.09% of defaced sites ran Windows, compared to 17.01% for
       GNU/Linux, out of 20,260 defaced websites.
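       The arithmetic behind the "disproportionate" claim is simple, and the sketch
       below makes it explicit. The defacement shares are the defaced.alldas.de
       figures quoted above; the deployment shares are placeholders (substitute
       whatever server-survey figures you trust), since the point is the comparison,
       not any particular market-share number.

         # Back-of-the-envelope check of the "disproportionate defacement" argument.
         # The defacement shares are the July 2001 defaced.alldas.de figures quoted
         # above; the deployment shares are PLACEHOLDER assumptions, not measured data.
         defacement_share = {"Windows": 0.6609, "GNU/Linux": 0.1701}
         deployment_share = {"Windows": 0.30, "GNU/Linux": 0.30}   # hypothetical

         for os_name in defacement_share:
             ratio = defacement_share[os_name] / deployment_share[os_name]
             print(f"{os_name}: defaced {ratio:.1f}x its share of deployed servers")
         # A ratio well above 1 means the OS is defaced more often than its share of
         # deployed web servers alone would explain.
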
    4. Red Hat Enterprise Linux did very well over a two-year period; the default
       install was only vulnerable to 3 critical flaws.
       [333]"Risk report: Two years of Red Hat Enterprise Linux 4" by Mark Cox
       examined Red Hat Enterprise Linux 4 AS from its release day, February 15,
       2005, through February 14, 2007. Over this time it released 289 security
       advisories, but this figure is very misleading, because that ignores severity
       and assumes a system has installed every available package (which is neither
       default nor likely). A default install of Enterprise Linux 4 AS was only
       vulnerable to 3 critical flaws. The system intentionally includes many
       mechanisms to prevent unknown vulnerabilities from being exploitable, or at
       least to reduce their impact. Mark Cox is not an independent observer; he is
       Director of the Red Hat Security Response Team, so keep that in mind. On the
       other hand, he's a technologist, not a marketer; he's developed software for
       many projects.
    5. Unpatched Linux systems last longer than unpatched Windows systems, according
       to a combination of studies from the Honeynet Project, AOL, and others. As
       [334]summarized in C|Net and [335]Vnunet, and described in more detail in
       [336]The Honeynet Project's report "Know Your Enemy: Trend Analysis" (17
       December 2004), as of 2004 the average unpatched Linux system lasted three
       months before being compromised (a significant increase from the 72-hour life
       span of a Linux system in 2001). Unpatched Windows systems continue to be
       compromised far more quickly, sometimes within minutes. This data on Windows
       compromise is consistent with other studies. [337]Avantgarde found that
       unpatched Windows systems in general did not last long; one unpatched Windows
       XP system (pre-SP2) lasted only 4 minutes on the Internet before it was
       compromised (see also [338]USAToday's "Unprotected PCs can be hijacked in
       minutes", which worked with AvantGarde). Note, however, that users who install
       Windows Service Pack 2 have much less risk than users of previous versions of
       Windows. The data is also consistent with Symantec's Internet Security Threat
       Report (January 1-June 30, 2004), and the [339]Internet Storm Center's Survival
       Time History reports a Windows survival time of 18 minutes by December 2004.
       It could be argued that because there are so many Windows systems, attackers
       tend to focus on Windows. However, Apache shows that merely having the
       largest market share does not automatically make a system the most
       vulnerable. In any case, there are good reasons to reduce use of a system if
       it is so easily subverted, regardless of the reasons, if there is an
       alternative.
    6. In a [340]2008 contest where the first successful attacker got the computer
       and prize money, Vista and MacOS fell but Linux stayed up. You need to take
       these contests with grains of salt, but still, that is pretty interesting.
    7. The Bugtraq vulnerability database suggests that the least vulnerable OS is
       FLOSS, and that all the FLOSS OSes in its study were less vulnerable than
       Windows in 1999-2000, unless you counted every GNU/Linux vulnerability
       multiple times. One approach to examining security is to use a vulnerability
       database; an analysis of one database is the [341]Bugtraq Vulnerability
       Database Statistics page. As of September 17, 2000, here are the total number
       of vulnerabilities for some leading OSes:

                         OS               1997 1998 1999 2000
                         Debian GNU/Linux 2    2    30   20
                         OpenBSD          1    2    4    7
                         Red Hat Linux    5    10   41   40
                         Solaris          24   31   34   9
                         Windows NT/2000  4    7    99   85

       You shouldn't take these numbers very seriously. Some vulnerabilities are
       more important than others (some may provide little access if exploited, or
       may only be exploitable in unlikely circumstances), and some are being
       actively exploited (while others have already been fixed before
       exploitation). FLOSS OSes tend to include many applications that are usually
       sold separately in proprietary systems (including Windows and Solaris). For
       example, Red Hat 7.1 includes two relational database systems, two word
       processors, two spreadsheet programs, two web servers, and many text editors.
       In addition, in the open source world, vulnerabilities are discussed
       publicly, so vulnerabilities may be identified for software still in
       development (e.g., "beta" software). Those with small market shares are
       likely to have less analysis. The "small market share" comment won't work
       with GNU/Linux, since GNU/Linux is the #1 or #2 server OS (depending on how
       you count them). Still, this clearly shows that the three FLOSS OSes listed
       (Debian GNU/Linux, OpenBSD, and Red Hat Linux) did much better by this
       measure than Windows in 1999 and (so far) in 2000. Even if a bizarre
       GNU/Linux distribution was created explicitly to duplicate all
       vulnerabilities present in any major GNU/Linux distribution, this
       intentionally bad GNU/Linux distribution would still do better than Windows
       (it would have 88 vulnerabilities in 1999, vs. 99 in Windows). The best
       results were for OpenBSD, an FLOSS OS that for years has been specifically
       focused on security. It could be argued that its smaller number of
       vulnerabilities is because of its rarer deployment, but the simplest
       explanation is that OpenBSD has focused strongly on security - and achieved
       it better than the rest.
       This data is partly of interest because various reporters make the same
       mistake: counting the same vulnerability multiple times. [342]One journalist,
       Fred Moody, failed to understand his data sources - he used these figures to
       try to show that GNU/Linux had worse security. He took these numbers and
       then added the GNU/Linux ones so each Linux vulnerability was counted at
       least twice (once for every distribution it applied to plus one more). By
       using these nonsensical figures he declared that GNU/Linux was worse than
       anything. If you read his article, you also must read [343]the rebuttal by
       the manager of the Microsoft Focus Area at SecurityFocus to understand why
       the journalist's article was so wrong.
       In 2002, [344]another journalist (James Middleton) made the same mistake,
       apparently not learning from prior work. Middleton counted the same Linux
       vulnerability up to four times. What's bizarre is that he even reported the
       individual numbers showing that specific Linux systems were actually more
       secure by using Bugtraq's vulnerability list through August 2001, and somehow
       he didn't realize what it meant. He noted that Windows NT/2000 suffered 42
       vulnerabilities, while Mandrake Linux 7.2 (now Mandriva) notched up 33
       vulnerabilities, Red Hat Linux 7.0 suffered 28, Mandrake 7.1 had 27 and
       Debian 2.2 had 26. In short, all of the GNU/Linux distributions had
       significantly fewer vulnerabilities by this count. It's not fully clear what
       was being considered as being "in" the OS in this case, which makes a
       difference. There are some hints that vulnerabilities in some Windows-based
       products (such as Exchange) were not counted, while vulnerabilities in
       GNU/Linux products with the same functionality (e.g., sendmail) were counted.
       It also appears that many of the Windows attacks were more dangerous (which
       were often attacks that could be invoked by remote attackers and were
       actively exploited), as compared to the GNU/Linux ones (which were often
       attacks that could only be invoked by local users and were not actively
       exploited at the time). I would appreciate links to someone who's analyzed
       these issues more carefully. The funny thing is that given all these errors,
       the paper gives evidence that the GNU/Linux distributions were more secure.
       The [345]September 30, 2002 VNUnet.com article "Honeymoon over for Linux
       Users", claims that there are more "Linux bugs" than "Microsoft bugs". It
       quotes X-Force (the US-based monitoring group of security software firm
       Internet Security Systems), and summarizes by saying that in 2001 the centre
       found 149 bugs in Microsoft software compared to 309 for Linux, and in 2002
       485 Linux bugs were found compared to Microsoft's 202. However, [346]Linux
       Weekly News discovered and reported serious flaws in these figures:
         1. "Each distribution is counted independently. The same vulnerability in
            five distributions will count as five separate vulnerabilities. This
            practice drastically overstates the number of reported Linux problems.
         2. Linux vulnerabilities include those in applications (i.e. PostgreSQL)
            which are not part of a standard Windows system.
         3. Most Linux vulnerabilities are found through code audits and similar
            efforts; they are patched and reported before any exploits happen. Any
            Windows bugs found through similar audits are fixed silently and do not
            appear in these counts.
       Indeed, even assuming that each vulnerability was counted only three times
       (and thus dividing by only 3) would show Linux as having a better result -
       never mind that there are more than 3 Linux distributions, and the other
       factors noted by Linux Weekly News (see the counting sketch after this item).
       Indeed, as noted in Bruce Schneier's [347]Crypto-gram of September 15, 2000,
       vulnerabilities are affected by other things such as how many attackers
       exploit the vulnerability, the speed at which a fix is released by a vendor,
       and the speed at which they're applied by administrators. Nobody's system is
       invincible.
       A more recent analysis by John McCormick in Tech Republic compared Windows
       and Linux vulnerabilities using numbers through September 2001. This is an
       interesting analysis, showing that although Windows NT led in the number of
       vulnerabilities in 2000, using the 2001 numbers through September 2001,
       Windows 2000 had moved to the "middle of the pack" (with some Linux systems
       having more, and others having fewer, vulnerabilities). However, it appears
       that in these numbers, bugs in Linux applications have been counted with
       Linux, while bugs in Windows applications haven't - and if that's so, this
       isn't really a fair comparison. As noted above, typical Linux distributions
       bundle many applications that are separately purchased from Microsoft.
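       The double-counting mistake described above is easy to see with a toy example.
       In the sketch below the vulnerability identifiers are invented; the point is
       only that summing per-distribution advisories counts the same upstream flaw
       several times, while a set union counts it once.

         # Toy illustration of the double-counting problem (the CVE identifiers are
         # invented; only the counting method matters).
         advisories = {
             "Red Hat": {"CVE-2001-0001", "CVE-2001-0002", "CVE-2001-0003"},
             "Debian": {"CVE-2001-0001", "CVE-2001-0002"},
             "Mandrake": {"CVE-2001-0001", "CVE-2001-0003", "CVE-2001-0004"},
         }

         naive_total = sum(len(cves) for cves in advisories.values())  # counts repeats
         unique_total = len(set().union(*advisories.values()))         # each flaw once

         print(f"per-distribution sum:     {naive_total}")   # 8
         print(f"distinct vulnerabilities: {unique_total}")  # 4
         # Summing per-distribution advisories makes "Linux" look worse than any
         # single distribution actually is; deduplicating gives the honest count.
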
    8. Red Hat (an FLOSS vendor) responded more rapidly than Microsoft or Sun to
       advisories; Sun had fewer advisories to respond to yet took the longest to
       respond. Another data point is that SecurityPortal has compiled a [348]list
       of the time it takes for vendors to respond to vulnerabilities. They
       concluded that:

     How did our contestants [fare]? Red Hat had the best score, with 348 recess
     days on 31 advisories, for an average of 11.23 days from bug to patch.
     Microsoft had 982 recess days on 61 advisories, averaging 16.10 days from bug
     to patch. Sun proved itself to be very slow, although having only 8 advisories
     it accumulated 716 recess days, a whopping three months to fix each bug on
     average.
       Their table of data for 1999 is as shown:

                                1999 Advisory Analysis
        Vendor   Total Days, Hacker Recess Total Advisories Recess Days/Advisory
       Red Hat              348                   31                       11.23
       Microsoft            982                   61                       16.10
       Sun                  716                   8                        89.50
       Clearly this table uses a different method for counting security problems
       than the prior table. Of the three noted here, Sun's Solaris had the fewest
       vulnerabilities, but it took by far the longest to fix security problems
       identified. Red Hat was the fastest at fixing security problems, and placed
       in the middle of these three in number of vulnerabilities. It's worth noting
       that the OpenBSD OS (which is FLOSS) had fewer reported vulnerabilities than
       all of these. Clearly, having a proprietary OS doesn't mean you're more
       secure - Microsoft had the largest number of security advisories, by far,
       using either counting method.
       More recent examples seem to confirm this; on September 30, 2002, [349]eWeek
       Labs' article "Open Source Quicker at Fixing Flaws" listed specific examples
       of more rapid response. This article can be paraphrased as follows: In June
       2002, a serious flaw was found in the Apache Web server; the Apache Software
       Foundation made a patch available two days after the Web server hole was
       announced. In September 2002, a flaw was announced in OpenSSL and a patch was
       available the same day. In contrast, a serious flaw was found in Windows XP
       that made it possible to delete files on a system using a URL; Microsoft
       quietly fixed this problem in Windows XP Service Pack 1 without notifying
       users of the problem. A more direct comparison can be seen in how Microsoft
       and the KDE Project responded to an SSL (Secure Sockets Layer) vulnerability
       that made the Internet Explorer and Konqueror browsers, respectively,
       potential tools for stealing data such as credit card information. The day
       the SSL vulnerability was announced, KDE provided a patch. Later that week,
       Microsoft posted a memo on its TechNet site basically downplaying the
       problem. The article [350]Linux Security Holes Opened and Closed makes the
       same argument: FLOSS systems fix problems more rapidly, reducing the time
       available for attackers to exploit them.
       In an August 18, 2004 interview, [351]Symantec's chief technology officer
       Robert Clyde argued that proprietary vendors were more reliable for fixing
       problems within a fixed timescale, and that he didn't know of a single vendor
       who would sit on a vulnerability. Yet the day before (August 17), an eWeek
       article revealed that [352]Oracle waited 8 months to fix a vulnerability. And
       Microsoft waited 9 months to fix a critical IE vulnerability (and only fixed
       it after it was being actively exploited in 2004). Proprietary vendors are
       certainly not winning prizes for reliably and rapidly fixing security
       vulnerabilities.
       This problem continues. [353]Gregg Keizer's November 19, 2007 article,
       "Microsoft DNS bug long known, familiar to researchers: Problem goes back at
       least a decade, say security pros" notes a delay of over 10 years in
       Microsoft's patches. In late 2007, Microsoft fixed a DNS cache poisoning bug
       in its Domain Name System (DNS) server included with Windows 2000 Server and
       Windows Server 2003. This was a "spoofing flaw that could be exploited by
       identity thieves or malware authors to silently redirect users from intended
       Web destinations to malicious pretenders". Both Trusteer and Scanit have
       exaplined that "the vulnerability is well known and has been extensively
       documented for more than a decade. 'It is saddening to realize that 10-15
       years after the dangers of predictable DNS transaction ID were discovered,
       still one of the most popular DNS cache servers does not incorporate strong
       transaction ID generation," said Amit Klein, Trusteer's chief technology
       officer". Scanit's Alla Bezroutchko cited research from 1997, 2002 and 2003
       on predictable DNS transaction IDs in Berkeley Internet Name Domain (BIND),
       showing that the problem is "common and well researched". In contrast, the
       developers of BIND (which is FLOSS) have been actively working to counter the
       attack.
    9. A 2002 survey of developers found that GNU/Linux systems are relatively
       immune from attacks from outsiders. Evans Data Corp.'s [354]Spring 2002 Linux
       Developer Survey surveyed over 400 GNU/Linux developers, and found that Linux
       systems are relatively immune from attacks from outsiders. Even though
       computer attacks have almost doubled annually since 1988 (according to CERT),
       78% of the respondents to the GNU/Linux developers survey have never
       experienced an unwanted intrusion and 94% have operated virus-free. Clearly,
       the survey shows that GNU/Linux "doesn't get broken into very often and is
       even less frequently targeted by viruses," according to Jeff Child (Evans
       Data Corp.'s Linux Analyst), who also claims that "Linux systems are relatively
       immune from attacks from outsiders". Child notes that it's much harder to
       hack a knowledgeable owner's system (and most Linux developers have hands-on,
       technical knowledge) and that because there are fewer desktop GNU/Linux
       systems there are fewer viruses being created to attack GNU/Linux. The
       developers being surveyed attributed the low incidence of attacks to the Open
       Source Software (OSS) environment; "more than 84% of Linux developers believe
       that Linux is inherently more secure than software not created in an OSS
       environment," and they ranked "Linux's security roughly comparable in
       security to Solaris and AIX ... and above any of the Windows platforms by a
       significant margin".
   10. Apache has a better security record than Microsoft's IIS, as measured by
       reports of serious vulnerabilities. Eweek's July 20, 2001 article
       [355]"Apache avoids most security woes" examined security advisories dating
       back to Apache 1.0. They found that Apache's last serious security problem
       (one where remote attackers could run arbitrary code on the server) was
       announced in January 1997. A group of less serious problems (including a
       buffer overflow in the server's logresolve utility) was announced and fixed
       in January 1998 with Apache 1.2.5. In the three and a half years since then,
       Apache's only remote security problems have been a handful of
       denial-of-service and information leakage problems (where attackers can see
       files or directory listings they shouldn't).
       In contrast, in the article [356]"IT bugs out over IIS security," eWeek
       determined that Microsoft has issued [357]21 security bulletins for IIS from
       January 2000 through June 2001. Determining what this number means is a
       little difficult, and the article doesn't discuss these complexities, so I
       examined these bulletins to find their true significance. Not all of the
       bulletins have the same significance, so just stating that there were "21
       bulletins" doesn't give the whole picture. However, it's clear that several
       of these bulletins discuss dangerous vulnerabilities that allow an external
       user to gain control over the system. I count 5 bulletins on such highly
       dangerous vulnerabilities for IIS 5.0 (in the period from January 2000
       through June 2001), and prior to that time, I count 3 such bulletins for IIS
       4.0 (in the period of June 1998 through December 1999). Feel free to examine
       the bulletins yourself; they are MS01-033, MS01-026, MS01-025, MS01-023,
       MS00-086, MS99-025, MS99-019, and MS99-003. The [358]Code Red worm, for
       example, exploited a vast number of IIS sites through the vulnerabilities
       identified in the June 2001 security bulletin MS01-033.
       In short, by totaling the number of reports of dangerous vulnerabilities
       (that allow attackers to execute arbitrary code), I find a total of 8
       bulletins for IIS from June 1998 through June 2001, while Apache had zero
       such vulnerabilities for that time period. Apache's last such report was in
       January 1998, and that one affected the log analyzer not the web server
       itself. As was noted above, the last such dangerous vulnerability in Apache
       itself was announced in January 1997.
       It's time-consuming to do this kind of analysis, so I haven't repeated the
       effort more recently. However, it's worth noting [359]eWeek's April 10, 2002
       article noting that ten more IIS flaws have been found in IIS Server 4.0,
       5.0, and 5.1, some of which would allow attackers to crash the IIS service or
       allow the attacker to run whatever code he chooses.
       Even this doesn't give the full story, however; a vulnerability in IIS tends
       to be far more dangerous than an equivalent vulnerability in Apache, because
       Apache wisely follows the good security practice of "least privilege". IIS is
       designed so that anyone who takes over IIS can take over the whole system,
       performing actions such as reading, modifying, or erasing any file on the
       system. In contrast, Apache is installed with very few privileges by default,
       so even taking over Apache gives attackers relatively few privileges. For
       example, cracking Apache does not give attackers the right to modify or erase
       most files. This is still not good, of course, and an attacker may be able to
       find another vulnerability that gives them unlimited access, but an Apache
       system presents more challenges to an attacker than IIS (a minimal sketch of
       this privilege-dropping pattern appears after this item).
       The article claims there are four reasons for Apache's strong security, and
       three of these reasons are simply good security practices. Apache installs
       very few server extensions by default (a "minimalist" approach), all server
       components run as a non-privileged user (supporting "least privilege" as
       noted above), and all configuration settings are centralized (making it easy
       for administrators to know what's going on). However, the article also claims
       that one of the main reasons Apache is more secure than IIS is that its
       "source code for core server files is well-scrutinized," a task that is made
       much easier by being FLOSS, and it could be argued that FLOSS encourages the
       other good security practices.
       Simple vulnerability notice counts are an inadequate metric for security.
       That's particularly true for comparing proprietary and FLOSS software; a
       vendor could intentionally release fewer bulletins - but since Apache's code
       and its security are publicly discussed, it seems very unlikely that Apache is
       deliberately underreporting security vulnerabilities. Fewer vulnerability
       notices could result if the product isn't well scrutinized or is rarely used
       - but this simply isn't true for Apache. Even the trend line isn't
       encouraging - using the months of the bulletins (2/99, 6/99, 7/99, 11/00,
       three in 5/01, and 6/01), I find the time in months between new major IIS
       vulnerability announcements to be 4, 1, 18, 6, 0, 0, 1, and 3 as of September
       2001; this compares to 12 and 44 as of September 2001 for Apache. Given these
       trends, it looks like IIS's security is slowly improving, but it has little
       likelihood of meeting Apache's security in the near future. Indeed, these
       vulnerability counts are corroborated by other measures such as the web site
       defacement rates.
       Indeed, in 2007 [360]Microsoft admitted that it silently fixes multiple
       vulnerabilities in patches without revealing what the other vulnerabilities
       are. That means that Microsoft's publicly posted vulnerability counts are
       significantly smaller than the real vulnerability counts. FLOSS, due to
       its open nature, often can't hide problems that way. What's worse, in
       [361]2007 Microsoft also admitted that Microsoft has left unpatched many more
       publicly-known vulnerabilities in Vista; Microsoft only patched 12 out of 27
       disclosed Vista vulnerabilities in the six months after it first shipped
       (November 2006), while during Windows XP's first six months, Microsoft's
       security team patched 36 out of 39.
       The issue here isn't whether or not a given program is invincible (what
       nonsense!) - the issue is which is more likely to resist future attacks,
       based on past performance. It's clear that the FLOSS Apache has a much better
       security record than the proprietary IIS, so much so that Gartner Group
       decided to make an unusual recommendation (described below).
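       The "least privilege" point above is easy to see in code. The following is a
       minimal, Unix-only sketch of the general pattern (it is not Apache's actual
       implementation): acquire the privileged resource first, then permanently drop
       to an unprivileged account - here the placeholder "www-data" - before handling
       any untrusted input, so that a later compromise yields only that account's
       limited rights.

         # Minimal sketch of the privilege-dropping pattern (Unix-only; the account
         # name "www-data" is a placeholder for whatever unprivileged user you use).
         import grp
         import os
         import pwd
         import socket

         def drop_privileges(user="www-data", group="www-data"):
             if os.getuid() != 0:
                 return                                # already unprivileged
             os.setgid(grp.getgrnam(group).gr_gid)     # drop the group first...
             os.setuid(pwd.getpwnam(user).pw_uid)      # ...then the user, permanently

         listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
         listener.bind(("0.0.0.0", 80))    # binding port 80 is the step needing root
         listener.listen(5)
         drop_privileges()
         # From here on, a compromise of the request-handling code yields only the
         # unprivileged account's rights, not control of the whole system.
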
   11. IIS was attacked 1,400 times more frequently than Apache in 2001, and Windows
       was attacked more than all versions of Unix. SecurityFocus co-founder and CEO
       Arthur Wong reported an analysis of the various vulnerabilities and attacks
       (based on SecurityFocus's data) in the February 2002 article [362]RSA:
       Security in 2002 worse than 2001, exec says. IIS was attacked 17 million
       times, but Apache was attacked only 12,000 times. This is a stunning
       comparison, since there are about twice as many Apache systems on the
       Internet. In 2001, Windows systems were attacked 31 million times, while Unix
       systems were attacked 22 million times. See the article for more information.
   12. The Gartner Group is recommending that businesses switch from Microsoft IIS
       to Apache or iPlanet due to IIS's poor security track record, noting that
       enterprises had spent $1.2 billion simply fixing Code Red (IIS-related)
       vulnerabilities by July 2001. Microsoft's IIS has such a bad security record
       that in September 2001, [363]Gartner Group announced a recommendation that
       "businesses hit by both Code Red and Nimda immediately investigate
       alternatives to IIS, including moving Web applications to Web server software
       from other vendors such as iPlanet and Apache. Although those Web servers
       have required some security patches, they have much better security records
       than IIS and are not under active attack by the vast number of virus and worm
       writers". Microsoft is sometimes a Gartner Group customer, so this
       announcement is especially surprising.
       In a [364]background document by Gartner, they discuss Code Red's impacts
       further. By July 2001, Computer Economics (a research firm) estimated that
       enterprises worldwide had spent $1.2 billion fixing vulnerabilities in their
       IT systems that Code Red could exploit (remember, Code Red is designed to
       only attack IIS systems; systems such as Apache are immune). To be fair,
       Gartner correctly noted that the problem is not just that IIS has
       vulnerabilities; part of the problem is that enterprises using IIS are not
       keeping their IT security up to date, and Gartner openly wondered why this
       was the case. However, Gartner also asked the question, "why do Microsoft's
       software products continue to provide easily exploited openings for such
       attacks?" This was prescient, since soon after this the "Nimba" attack
       surfaced which attacked IIS, Microsoft Outlook, and other Microsoft products.
       A brief aside is in order here. Microsoft spokesman Jim Desler tried to
       counter Gartner's recommendation, trying to label it as "extreme" and saying
       that "serious security vulnerabilities have been found in all Web server
       products and platforms.. this is an industry-wide challenge". While true,
       this isn't the whole truth. As Gartner points out, "IIS has a lot more
       security vulnerabilities than other products and requires more care and
       feeding". It makes sense to select the product with the best security track
       record, even if no product has a perfect record.
   13. Microsoft IIS is twice as likely to be serving malware, according to a 2007
       Google study. The study [365]Web Server Software and Malware found that,
       relative to its share of web servers generally, Microsoft IIS "features twice
       as often (49% vs. 23%) as a malware distributing server". This is because
       Apache has a much larger market share, yet among servers with malicious
       software the two have about the same market share. This does not necessarily
       mean that IIS is more vulnerable (though the data listed elsewhere does
       support that hypothesis), particularly because it varies by country. Instead,
       the authors of this study suspect it is caused because "automatic updates
       have not been enabled due to software piracy (piracy statistics from
       NationMaster, and BSA), and second, some security patches are not available
       for pirated copies of Microsoft operating systems. For instance the patch for
       a commonly seen ADODB.Stream exploit is not available to pirated copies of
       Windows operating systems". In short, because Windows and IIS cost much more
       than Linux and Apache, many copies are unauthorized (pirated), and since such
       unauthorized systems cannot receive security updates, many of these systems
       are not being properly maintained.
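       The "twice as often" figure is just a ratio of the two shares the study
       reports, reading 49% as IIS's share among malware-distributing servers and 23%
       as its share of the study's general server sample:

         # The study's "twice as often" claim as simple arithmetic: IIS's share among
         # malware-distributing servers divided by its share of the general sample.
         share_among_malware_servers = 0.49
         share_of_general_sample = 0.23

         lift = share_among_malware_servers / share_of_general_sample
         print(f"IIS appears among malware servers {lift:.1f}x as often as overall")
         # ~2.1, i.e. roughly "twice as often".
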
   14. The majority of the most serious security problems only apply to Microsoft's
       products, and not to FLOSS products, as suggested by the CERT/CC's "most
       frequent, high-impact types of security incidents and vulnerabilities" and
       the ICAT database. Some security vulnerabilities are more important than
       others, for a variety of reasons. Thus, some analysis centers try to
       determine what's "most important," and their results suggest that FLOSS just
       doesn't have as many vulnerabilities.
       The CERT Coordination Center (CERT/CC) is federally funded to study security
       vulnerabilities and perform related activities such as publishing security
       alerts. I sampled their list of [366]"current activity" of the most frequent,
       high-impact security incidents and vulnerabilities on September 24, 2001, and
       found yet more evidence that Microsoft's products have poor security compared
       to others (including FLOSS). Four of the six most important security
       vulnerabilities were specific to Microsoft: W32/Nimda, W32/Sircam, cache
       corruption on Microsoft DNS servers, and "Code Red" related activities. Only
       one of the six items primarily affected non-Microsoft products (a buffer
       overflow in telnetd); while this vulnerability is important, it's worth
       noting that many open source systems (such as Red Hat 7.1) normally don't
       enable this service (telnet) in the first place and thus are less likely to
       be vulnerable. The sixth item ("scans and probes") is a general note that
       there is a great deal of scanning and probing on the Internet, and that there
       are many potential vulnerabilities in all systems. Thus, 4 of the 6
       high-impact issues are vulnerabilities specific to Microsoft, 1 of 6 is a
       vulnerability primarily affecting Unix-like systems (including FLOSS OSes),
       and 1 of 6 is a general notice about scanning. Again, it's not that FLOSS
       products never have security vulnerabilities - but they seem to have fewer of
       them.
       The [367]ICAT system provides a searchable index and ranking for the
       vulnerabilities cross-referenced by CVE. I sampled its top ten list on
       December 19, 2001; this top ten list is defined by the number of requests
       made for a vulnerability in ICAT (and including only vulnerabilities within
       the last year). In this case, 8 of the top 10 vulnerabilities only affect
       proprietary systems (in all cases, Windows). Only 2 of 10 affect FLOSS
       systems (#6, CAN-2001-0001, a weakness in PHP-Nuke 4.4, and #8,
       CVE-2001-0013, a new vulnerability found in an old version of BIND - BIND 4).
       Obviously, by itself this doesn't prove that there are fewer serious
       vulnerabilities in FLOSS programs, but it is suggestive of it.
   15. An analysis of security reports by Nicholas Petreley found that a much larger
       percentage of Windows vulnerabilities are critical compared to Red Hat Linux.
       In October 2004, Nicholas Petreley's paper "Security Report: Windows vs
       Linux" (available in [368]HTML or [369]PDF) found that Windows
       vulnerabilities are far more likely to be serious than vulnerabilities in Red
       Hat Linux. He examined the 40 most recent patches/vulnerabilities listed for
       Microsoft Windows Server 2003 vs. Red Hat Enterprise Linux AS v.3, as
       reported by each vendor's website. He then used a metric to score their
       severity, and by that measure, 50% of the Windows vulnerabilities are
       critical, compared to 10% being critical in Red Hat.
       There's an interesting twist here; Microsoft claims that certain
       vulnerabilities aren't as serious as long as an administrator doesn't change
       certain settings. But as Petreley notes, "it is nearly inconceivable that
       anyone who uses Windows Server 2003 will leave the [Windows Server 2003]
       settings ... unchanged. These settings make the Internet Explorer browser
       nearly useless to the server administrator who wants to perform any
       browser-based administrative tasks, download updates, etc. To lower the
       severity rank based on the assumption that Windows Server 2003 users will
       leave these default settings as they are is a fantasy, at best". Also,
       Microsoft presumes that "Users" are never "Administrators", a very doubtful
       assumption on a Microsoft Windows server. If you accept these implausible
       claims, the percentage drops to 40%, which is still larger than Red Hat's.
       Microsoft assigns its own criticality levels (Red Hat doesn't), but even
       using Microsoft's reporting level things are worse; 38% of the patched
       programs are rated as Critical by Microsoft.
       He also did some analysis of the CERT database; while that analysis was more
       limited, that still suggested that Linux vulnerabilities tended to be less
       severe.
       The article goes on to argue against what it terms "myths". Petreley also
       argues that the reason for this difference is that Linux-based systems have a
       far better design for security than Windows systems. His design argument
       makes four statements: Linux-based systems are based on a long history of
       well fleshed-out multi-user design, they are modular by design (not
       monolithic), they are not constrained by an RPC model (that unnecessarily
       enables external control of internal functions), and Linux servers are
       ideally designed for headless non-local administration.
       This study didn't try to determine how many critical vulnerabilities there
       have been overall in the same period, which is a weakness of the study. And
       Petreley is certainly an advocate of GNU/Linux systems. Still, this report
       makes a plausible case that there is a difference in design and/or
       development process that makes GNU/Linux vulnerabilities less severe than
       Microsoft Windows vulnerabilities.
   16. Computer viruses are overwhelmingly more prevalent on Windows than any other
       system. Virus infection has been a major cost to users of Microsoft Windows.
       The LoveLetter virus alone is estimated to have cost $960 million in direct
       costs and $7.7 billion in lost productivity, and the anti-virus software
       industry sales total nearly $1 billion annually. Dr Nic Peeling and Dr Julian
       Satchell's [370]Analysis of the Impact of Open Source Software includes an
       analysis of the various data sources for virus counts, noting the
       disproportionate vulnerability of Windows systems. Here is what they said:

     The numbers differ in detail, but all sources agree that computer viruses are
     overwhelmingly more prevalent on Windows than any other system. There are
     about 60,000 viruses known for Windows, 40 or so for the Macintosh, about 5
     for commercial Unix versions, and perhaps 40 for Linux. Most of the Windows
     viruses are not important, but many hundreds have caused widespread damage.
     Two or three of the Macintosh viruses were widespread enough to be of
     importance. None of the Unix or Linux viruses became widespread - most were
     confined to the laboratory.
       Many have noted that one reason Windows is attacked more often is simply
       that there are so many Windows systems in use, making it an attractive target
       for virus writers. For a virus to spread, it must transmit itself to other
       susceptible computers; on average, each infection must cause at least one
       more. The ubiquity of Windows machines makes it easier for this threshold to
       be reached (a tiny simulation after this item illustrates the threshold).
       There may be a darker reason: there are many who do not like Microsoft's
       business practices, and perhaps this contributes to the problem. Some of
       Microsoft's business practices have been proven in court to be illegal, but
       the U.S. government appears unwilling to effectively punish or stop those
       practices. Some computer literate people may be taking their frustration out
       on users of Microsoft's product. This is absolutely wrong, and in most
       countries illegal. It is extremely unethical to attack an innocent user of a
       Microsoft product simply because of Microsoft's policies, and I condemn such
       behavior. At this point, although this has been speculated many times, I have
       not found any evidence that this is a widespread motivator for actual
       attacks. On the other hand, if you are choosing products, do you really want
       to choose a product that people may have a vendetta against?
       However, the reasons given above don't explain the disproportionate
       vulnerability of Microsoft's products. A simpler explanation, and one that is
       easily proven, is that Microsoft has made many design choices over many years
       in their products that have rendered them fundamentally less secure, and this
       has made their products a much easier target than many other systems. Even
       [371]Microsoft's Craig Mundie admitted that their products were "less secure
       than they could have been" because they were "designing with features in mind
       rather than security" -- even though most people didn't use those new
       features. Examples include executing start-up macros in Word (even though
       users routinely view documents developed by untrustworthy sources), executing
       attachments in Outlook, and the lack of write protection on system
       directories in Windows 3.1/95/98. This may be because Microsoft has assumed
       in the past that customers will buy their products whether or not Microsoft
       secures them. After all, until recently there's been little competition, so
       there was no need to spend money on "invisible" attributes such as security.
       It's also possible that Microsoft is still trying to adjust to an
       Internet-based world; the Internet would not have developed as it has without
       Unix-like systems, which have supported the Internet standards for decades,
       while for many years Microsoft ignored the Internet and then suddenly had to
       play "catch-up" in the early 1990s. Microsoft has sometimes claimed that they
       can't secure their products because they want to ensure that their products
       are "easy to use". While it's true that some security features can make a
       product harder to use, usually a secured product can be just as easy to use
       if the security features are carefully designed into the product. Besides,
       what's so easy to use about a system that must be reformatted and reinstalled
       every few months because yet another virus got in? (This is a problem made
       worse because [372]Microsoft plans to require people to call Microsoft to
       gain permission simply to reinstall the operating system they bought.) But
       for whatever the reason, it's demonstrably true that Microsoft's designers
       have in the past made decisions that made their products' security much
       weaker than other systems. Microsoft has recently declared that they are
       working hard to improve their products' security; I have hopes that they will
       improve, and I see some encouraging signs, but it's like to take many years
       to really secure their products.
       In contrast, while it's possible to write a virus for FLOSS OSes, their
       design makes it more difficult for viruses to spread... showing that
       Microsoft's design decisions were not inevitable. It appears that FLOSS
       developers tend to select design choices that limit the damage of viruses,
       probably in part because their code is subject to public inspection and
       comment (and ridicule, if deserving of it). For example, FLOSS programs
       generally do not support attacker-controlled start-up macros, nor do they
       usually support easy execution of mail attachments from attackers. Also,
       leading FLOSS OSes (such as GNU/Linux and the *BSDs) have always had write
       protection on system directories, making it more difficult for certain
       attacks to spread. [373]Another discussion on why viruses don't seem to
       significantly affect FLOSS systems is available from Roaring Penguin. FLOSS
       systems are not immune to malicious code, but they are certainly more
       resistant.
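       The "each infection must cause at least one more" remark above is the classic
       epidemic threshold, and a tiny simulation makes it concrete. Everything below
       is invented for illustration; the infection rate R is just a parameter, not a
       measured property of any real virus.

         # Tiny branching-process sketch of the spread threshold: each infected
         # machine infects a Poisson-distributed number of new machines with mean R.
         # All parameters are made up for illustration.
         import numpy as np

         def outbreak_size(r, rng, max_size=100_000):
             infected = frontier = 1
             while frontier and infected < max_size:
                 frontier = int(rng.poisson(r, size=frontier).sum())  # next generation
                 infected += frontier
             return infected

         rng = np.random.default_rng(1)
         for r in (0.8, 1.3):
             sizes = [outbreak_size(r, rng) for _ in range(200)]
             big = sum(s >= 1000 for s in sizes)
             print(f"R = {r}: {big}/200 outbreaks reached 1000+ machines "
                   f"(largest: {max(sizes)})")
         # Below R = 1, outbreaks fizzle out; above R = 1, a sizable fraction
         # snowball - which is why the sheer number of susceptible Windows machines
         # matters so much.
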
   17. Surveys report that GNU/Linux systems experience fewer viruses and successful
       cracks. In July 2004, [374]Evans Data's Summer 2004 Linux Development Survey
       reported that 92% of respondents' Linux systems have never been infected with
       a virus, and 78% of respondents said their Linux systems have never been
       cracked (called "hacked" in the report). This contrasts with their Spring 2004
       survey, where only 40% of non-Linux users reported no security breach; indeed,
       32% of non-Linux users experienced three or more breaches.
   18. According to a June 2004 study by Sandvine, 80% of all spam is sent by
       infected Windows PCs. [375]80% of all spam comes from computers contaminated
       with Trojan horse infections, according to a [376]study by network management
       firm Sandvine. Trojans and worms with backdoor components turn infected PCs
       into drones in vast networks of compromised zombie PCs.
       Sandvine identified subscribers bypassing their home mail servers and
       contacting many mail servers within a short period of time over sustained
       periods - i.e., spammers. It also looked at SMTP error messages returned to
       clarify the total volume of spam. They then compared this with the messages
       passing through the service provider's mail system.
       Sandvine's preliminary analysis has shown that the most active Trojans for
       spamming purposes are the Migmaf and SoBig variants; note that these are
       Windows-only attacks. Indeed, since almost all successful trojans and worms
       are those that attack Windows systems, it appears that this problem is
       essentially due to Windows systems.
   19. National Cyber Security Alliance's study of May 2003 reported that 91% of
       Broadband users have spyware on their home computers running proprietary
       operating systems; in contrast, there's no evidence that this is an issue
       for FLOSS systems. America Online, Inc. conducted a study for the National
       Cyber Security Alliance. Its report, [377]"Fast and Present Danger: In-Home
       Study on Broadband Security among American Consumers" (May 2003), produced
       some interesting results; in particular, it found that "91% of Broadband
       Users Have Spyware Lurking on Home Computers". Their study method did not
       appear to permit collection of data from FLOSS systems, and spyware systems
       are essentially nonexistent on FLOSS systems anyway.
   20. Microsoft has had far more vulnerabilities than anyone else, according to
       SecurityTracker. The paper [378]SecurityTracker Statistics (March 2002)
       analyzes vulnerabilities from April 2001 through March 2002. They identified
       1595 vulnerability reports, covering 1175 products from 700 vendors. Their
       analysis found that Microsoft had more vulnerabilities than anyone else (187,
       or 11.7% of all vulnerabilities), and more than four times the next vendor.
       The next largest were Sun (42, 2.6% of the total), HP (40, 2.5%), and IBM
       (40, 2.5%). Solely FLOSS vendors did much better: the Apache Software
       Foundation had 13 (0.8% of the total), and Red Hat had 10 (0.6% of the
       total). It can be argued that Microsoft sells more kinds of software than
       most other vendors, but this is nevertheless an astonishingly large number of
       vulnerabilities. The gap between Microsoft and everyone else widened during
       the second half of the year, which is even scarier.
   21. In late June 2004, [379]the U.S. Department of Homeland Security's Computer
       Emergency Readiness Team (CERT) recommended using browsers other than
       Microsoft Corp.'s Internet Explorer (IE) for security reasons. Microsoft had
       failed to patch a critical vulnerability for 9 months, and IE was being
       actively exploited in horrendous ways. Customers then rushed to download
       Mozilla and Mozilla Firefox, popular FLOSS alternatives, to replace IE. This
       was a good idea, since 4 more serious IE vulnerabilities were soon admitted,
       and the technologically savvy began to switch in droves to FLOSS browsers.
       [380]The U.S. CERT warned that the Microsoft browser (IE) cannot protect
       against vulnerabilities, and there were dangerous active attacks exploiting
       them. A team of crackers (supposedly Russia-based) exploited Microsoft IE
       vulnerabilities by also exploiting other vulnerabilities in Microsoft's IIS.
       The crackers broke into IIS sites and inserted malicious code that IE users
       would download if they viewed an IIS site they'd broken into. The IE users
       who visited those sites (who legitimately trusted these sites) would have
       their IE program exploited, which then compromised their system. As a result,
       many IE users had keystroke information stolen from them. It's hoped the
       purpose was to steal credit card numbers, though passwords and other
       sensitive data could have been stolen too (e.g., to drain people's bank
       accounts or steal extremely private data). By June 25, 2004, this active
       attack was publicly known, but a fix to IE wasn't available until July 2, 7
       days later. Even worse, ZDNet found that [381]Microsoft had failed to fix
       this critical known IE vulnerability for nearly nine months. And even after a
       9-month lead time, ComputerWorld learned that the patch [382]doesn't address
       another closely related vulnerability.
       Nine months is shamefully long; 2-30 days is the timeframe expected by most
       security practitioners, since every day a known exploit is unfixed is another
       day that attackers can exploit it, and attackers often know and exploit
       attacks that the vendor claims are secret. This is long after Microsoft
       loudly announced (in 2002) that it would pay much more attention to security;
       certainly in this case users were left unprotected for a long time. Even more
       tellingly, at the same time (June 28, 2004), [383]Microsoft's Bill Gates told
       Australians that while other operating system vendors took 90-100 days to
       release a security patch, Microsoft had this time "down to less than 48
       hours". Gates assured attendees that the Internet Explorer attack was new,
       but later analysis has shown otherwise. Clearly Microsoft admits that long
       delays in security patches are a bad thing, but it nevertheless still commits
       them.
       The U.S. CERT took the unusual step of noting that a useful solution would be
       to stop using IE and use another program instead. [384]SANS made a similar
       announcement, noting that one solution would be to stop using IE. FLOSS
       programs sometimes have vulnerabilities too, but it's rare that they last so
       long. More importantly, users of FLOSS programs can always fund the creation
       and rapid deployment of a repair if it is important to them, and can have
       that fix reviewed and shared with others worldwide. Proprietary users have no
       such options; they are completely dependent on the proprietary vendor to make
       any emergency repairs, and to react more responsibly than happened in this
       case. [385]Downloads of Mozilla and Mozilla's Firefox
       dramatically increased in late June 2004, presumably as a response to this
       serious problem in IE. Downloads of Mozilla and Firefox browsers hit an
       all-time high on July 1, 2004, from the usual 100,000 or so downloads on a
       normal day to more than 200,000 in one day. Mozilla argues that IE is in
       general less secure, in part because Microsoft's ActiveX technologies, IE's
       tight integration into the Microsoft operating system, and IE's weak default
       security settings make IE easier to exploit than its competition. [386]Even
       the U.S. CERT notes that IE includes many design decisions that make it an
        especially easy web browser to exploit; all of those design decisions apply to
        IE and not to Firefox, except for the fact that both use graphical user
        interfaces. For example, [387]Symantec recommends that users consider
       disabling ActiveX altogether (see page 65), because of ActiveX's problems. In
       contrast, every change made to Mozilla applications is first peer reviewed by
       at least two engineers who are familiar with the code and overall
       architecture of the system before the new code is allowed into the product.
       The product then goes through automated tests and evaluations, and then
       Mozilla users and the development community are invited to review the impact
       of each change by downloading the test builds that are produced two or three
       times a day. All source code is available for review by anyone.
       This problem was so significant that it was noted in many different media and
       technology analysis sites. [388]USA Today noted in 2004 that "Using
       Microsoft's Internet Explorer Web browser to surf the Internet has become a
       marked risk -- even with the latest security patches installed". [389]The New
       York Times noted in 2004 that concerns about Internet Explorer's security
       vulnerabilities have dented its market share, and that the US CERT
       recommendation to consider other browsers was an unusual step. [390]The
       Inquirer reported that the "US Government warns against Internet Explorer",
       noting that the US Government's tone essentially pleaded for "users to stop
       using Microsoft's Internet Explorer". [391]Netcraft suggested that this may
        mean that the browser wars will recommence. Netcraft noted that one major
        difference this time was the extreme gravity of the attack:
       "victims of [these] attacks might conceivably lose their life savings. Some
       people now perceive Internet Explorer and Internet Banking as a potentially
       lethal cocktail that must not be mixed, with insiders in the banking industry
       urging their families to switch if not operating systems, then at least
       browsers, while conversely some Internet banking customers have adapted to
       the threat by forgoing convenience and moving funds back into accounts which
       require traditional telephone and fax instructions". Netcraft also noted that
       there is now "a serious alternative to Internet Explorer available on
       Windows" and that "this [combination of loss of confidence and a viable
       alternative] is an extremely dangerous situation for Microsoft. The phishing
       threats and the growing professional chorus of disapproval for Internet
       Explorer provide Windows users with very good reasons to turn elsewhere, even
       if only temporarily. But [FLOSS] Firefox is so good that many will want to
       stay with it. And once they have tasted the power and freedom of open source,
       maybe they will be tempted to try `just one more program'".
        Indeed, the security problems of IE have caused [392]IE to lose market share,
        ceding it to FLOSS browsers.
       As if to prove the point of how differently security vulnerabilities are
       handled, a vulnerability was found soon after that affected Mozilla and
       Firefox when running on Windows (though it was actually another Windows
       vulnerability). [393]In contrast with IE, the security fix was delivered
       extremely rapidly. The initial notice of this vulnerability was on July 7, it
       was fixed the same day, and the configuration change was released to all in
       one day - with no known compromises to any system. The Mozilla project has
       [394]more information about the security issue, and you can even read the
       [395]detailed discussions between the finders and developers. What's
       especially interesting is that it's not even a vulnerability in the FLOSS
        programs; it's a vulnerability in Windows itself. The problem is that Windows
        maintains a registry of programs considered safe for handling URLs, but the
        list provided by Microsoft includes an application known to be insecure (the
        shell: URL). Windows XP Service Pack 1 was supposed to have closed this hole,
        but it didn't. Thus, the Mozilla project had to create a patch to compensate
        for Windows' insecurity by explicitly disabling that handler on Windows. It
        appears
       [396]that other Microsoft products, such as MSN Messenger and Word, are
       affected by this vulnerability in Windows. And it appears that Mozilla is
       continuing to be proactive in its security; they have [397]already added new
       features to make attacks against the browser even more difficult.
       After all that, on July 13, 2004, [398]Secunia reported four more extremely
       critical vulnerabilities in IE. The only solutions at the time were to
       disable active scripting or use another product. It's unlikely that these
       additional vulnerabilities will improve IE's reputation. All of this has
       convinced me; in my [399]essay on how to secure Microsoft Windows (for home
       and small business users), I suggest switching from IE to Firefox, and from
       Outlook to something else; too many people (both myself and others) have
       observed that simply replacing these two programs greatly reduces the number
       of security problems in the real world.
   22. According to Symantec Corp., Mozilla Firefox fixed its vulnerabilities
       faster, and had fewer severe vulnerabilities (though more total
       vulnerabilities), in the July - December 2004 period than Internet Explorer.
       [400]Symantec Internet Security Threat Report, Volume VII (released March
       2005), found that Internet Explorer had 9 highly severe vulnerabilities
       affecting it in the time period, while Firefox had 7. In addition, the
       Internet Explorer flaws also took longer to fix -- an average of 43 days,
       compared to 26 days for Mozilla browsers (which presumably includes Firefox).
        In all previous reports, the total number of Mozilla vulnerabilities was lower
        than IE's. The bad news is that this March 2005 report found that in this
        period there were more total vulnerabilities (though fewer high severity
        ones) in Mozilla-based browsers than in IE. There were 13 vulnerabilities
        affecting Internet Explorer, compared to 21 vulnerabilities affecting the
        Mozilla and Mozilla Firefox browsers during the survey period. It's difficult
        to tease out what the issue is, unfortunately. Symantec was encouraged that
        the vulnerabilities found in Firefox were at least less
        likely to be of high severity. The good (?) news is that attackers were only
       exploiting the IE vulnerabilities, not the Mozilla/Firefox ones, in the time
       period.
        CNet reported in an article about Symantec's later September 2005 report that
       [401]Mozilla browsers were more vulnerable than IE -- yet once all
       information is taken into account, IE was more vulnerable. This latest study
       found that 25 vendor-confirmed vulnerabilities were disclosed for the Mozilla
       browsers during the first half of 2005 (18 were high severity); during the
       same period, 13 vendor-confirmed vulnerabilities were disclosed for IE (eight
       were high severity).
       But wait -- there was a major caveat that made the headline misleading.
       Symantec only counted the security flaws that have been confirmed by the
       vendor; vulnerabilities that are known to the public, but not acknowledged by
       the vendor, aren't counted. CNet examined data from security monitoring
       company Secunia to see what that meant, and found that there are 19 security
       issues that Microsoft still has to deal with for Internet Explorer, while
       there are only three for Firefox.
       Internet Explorer is definitely not better than Mozilla-based browsers once
        you include the vulnerabilities the vendor has not yet fixed. IE has a
        total of 32 known vulnerabilities (13+19) compared to 28 (25+3)
       vulnerabilities over that period. That's pretty close, so in terms of known
       vulnerabilities over that period I'd call that a tie. [402]Mozilla also noted
       that IE tended to have more serious vulnerabilities. What's even more
       concerning, though, is that Internet Explorer has more unpatched
       vulnerabilities (13 vs. 3). And while they claim both now have similar
       response times (6 days) it's not clear how that could be true. (Especially
       when you only consider the ones that are publicly announced first; clearly,
       it's easy to have a patch immediately if you only publicly announce the
        vulnerability with the patch, but sometimes vulnerabilities are publicly
        announced when a patch is not available.) CNet themselves note that Microsoft
        generally releases patches only on a monthly basis, which is more than 6 days.
        Even more importantly, since IE has many more unaddressed vulnerabilities than
        Mozilla, IE's average response times would increase more rapidly too
        (making "equality" only make sense when you ignore the unpatched
       vulnerabilities).
   23. More recent summaries as of August 2005 suggest Internet Explorer is still
       more dangerous than the FLOSS Firefox. David Hammond's [403]Internet Explorer
       is dangerous examined the Secunia reports on Internet Explorer, Firefox, and
        Opera, as of August 4, 2005. Here is his summary (credit to him for the data):

                Feature           Internet Explorer Firefox Opera
       Historical quantity        43                21      23
       Present quantity           25                4       0
       Historical relative danger 121               56      59
       Present relative danger    50                9       0
       The "quantity" shows the number of vulnerabilities, but doesn't account for
        their criticality. Thus, he also computes a "relative danger" by simply
        "adding up the criticality levels for each vulnerability (not critical=1,
        extremely critical=5)" (a small sketch of this computation appears at the
        end of this item). As of that date:
          + "Internet Explorer has had 43 reported vulnerabilities. 7 were marked as
            moderately critical, 11 were marked as highly critical, and 6 were
            marked as extremely critical. There are still 25 unfixed
            vulnerabilities, including 6 that were marked as moderately critical, 1
            that was marked as highly critical, and 1 that was marked as extremely
            critical".
          + "Mozilla Firefox has had 21 reported vulnerabilities. 8 were marked as
            moderately critical, 4 were marked as highly critical, and 0 were marked
            as extremely critical. There are still 4 unfixed vulnerabilities,
            including 1 that was marked as moderately critical".
          + "Opera has had 23 reported vulnerabilities. 14 were marked as moderately
            critical, 0 were marked as highly critical, and 0 were marked as
            extremely critical. All reported vulnerabilities have since been fixed".
       Obviously, this doesn't show that FLOSS is always better than proprietary; by
       these measures, the proprietary Opera is even better. But it does clearly
       suggest that FLOSS can do very well, beating at least some competitors.
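        Hammond's "relative danger" figure is easy to reproduce: it is just a
        severity-weighted sum over the reported vulnerabilities. Purely as an
        illustration (the counts below are hypothetical, not taken from his data),
        a minimal sketch of that computation in Python might look like this:

          # Sketch of a "relative danger" score: sum the Secunia-style criticality
          # level (not critical=1 ... extremely critical=5) over all reported
          # vulnerabilities.  The counts below are hypothetical.
          CRITICALITY = {
              "not critical": 1,
              "less critical": 2,
              "moderately critical": 3,
              "highly critical": 4,
              "extremely critical": 5,
          }

          def relative_danger(counts):
              """counts maps a criticality label to a number of vulnerabilities."""
              return sum(CRITICALITY[label] * n for label, n in counts.items())

          # Hypothetical browser with 3 not critical, 2 moderately critical, and
          # 1 highly critical vulnerability:
          print(relative_danger({"not critical": 3, "moderately critical": 2,
                                 "highly critical": 1}))   # 3*1 + 2*3 + 1*4 = 13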
    24. Statistics by Scanit's Browser Security Test group found that 98% of the time in
       2004 Internet Explorer was vulnerable to dangerous known remote attacks, for
       which no patch to fix it was available, compared to 17% for Opera and 15% for
       Mozilla/Firefox. There were only 7 days in 2004 that Internet Explorer was
       safe from known yet unstoppable remote attacks. The paper [404]A Year Of Bugs
       by [405]scanIT's Browser Security Test examined the life spans of
       vulnerabilities during 2004 for three popular browsers: Microsoft's Internet
       Explorer, Mozilla-based browsers (including Firefox and Netscape), and Opera.
       Since not all vulnerabilities are equal, they only considered the especially
       dangerous "remote code execution" vulnerabilities, i.e., defects that allow a
       "malicious web page or e-mail message to execute arbitrary code or OS
       commands on the viewer's computer". They then compared the time from the
       "public announcement of the vulnerability to the time when the fix is
        available to the general user population" (a small sketch of this kind of
        day-counting appears at the end of this item). The results were disturbing,
        if you use Internet Explorer:
          + For Internet Explorer, "there was only one period in 2004 when there
            were no publicly known remote code execution bugs - between the 12th and
            the 19th of October - 7 days in total". That means that someone who
            diligently kept their installation patched every day of the year was
            still known to be vulnerable 98% of the time in 2004. The excuse "well,
            it wasn't exploitable" doesn't work, either; they found that for "200
            days (that is, 54% of the time) there was a [known] worm or virus in the
            wild exploiting one of those unpatched vulnerabilities". And that's just
            the known mass attacks in the wild; it's probably foolish to presume
            that those were the only attacks. Frankly, 2004 was a disturbing year
            for IE; at the beginning of the year there were two known unpatched
            vulnerabilities, and 2004 ended with an "unpatched HTML Help ActiveX
            control vulnerability and [the worm] Trojan.Phel using it to install a
            backdoor".
          + In 2004 Opera had publicly known unpatched remote code execution
            vulnerabilities for 65 days (17%). It could have been worse, but two
            different "unpatched periods" happened to intersect, so it actually
             fared better by this measure than it might have otherwise.
           + Mozilla and its family (including Firefox, Netscape Navigator and the
             Camino browsers) had the shortest attack window of opportunity. There
            were 56 days (15%) in 2004 when there was a publicly known remote code
            execution vulnerability with no publicly-available patch, and about half
            of that 15% only applied to MacOS users. There was a 30 day period in
            May-June for an attack that only affected MacOS users, one day in July
            for a "shell: protocol" vulnerability (with a very rapid fix), one day
            in August for a libPNG vulnerability, and 24 days in October-November
             for problems found by Michal Zalewski's mangleme program. Note
            that in several cases, the time between the report and fix was one day
            or less. At no time were any vulnerabilities being actively exploited,
            as far as anyone knows.
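        The day-counting behind these percentages is straightforward: take the union
        of all the "publicly disclosed but unpatched" windows and count how many days
        of the year fall inside it. As a minimal sketch (the date intervals below are
        made up for illustration, not Scanit's actual data):

          # Sketch of a "days at risk" metric: count the days in a year on which at
          # least one publicly disclosed vulnerability had no patch available.
          # The date intervals below are hypothetical, not Scanit's data.
          from datetime import date, timedelta

          def days_at_risk(year, windows):
              """windows is a list of (disclosure_date, patch_date) pairs."""
              start, end = date(year, 1, 1), date(year, 12, 31)
              exposed = set()
              for disclosed, patched in windows:
                  day = max(disclosed, start)
                  while day < patched and day <= end:
                      exposed.add(day)
                      day += timedelta(days=1)
              return len(exposed)

          # Two hypothetical, partly overlapping unpatched windows in 2004:
          windows = [(date(2004, 3, 1), date(2004, 3, 20)),
                     (date(2004, 3, 15), date(2004, 4, 1))]
          total = days_at_risk(2004, windows)
          print(total, f"days, or {100.0 * total / 366:.0f}% of the (leap) year")
          # prints: 31 days, or 8% of the (leap) year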
    25. Security Fix found that 78% (284/365) of the time in 2006 Internet Explorer was
        vulnerable to dangerous known attacks, for which no patch to fix it was
        available, compared to 2% (9/365) for Mozilla Firefox. [406]Brian Krebs'
        "Security Fix" column compiled statistics on vulnerability response times,
       including those for Microsoft Internet Explorer (IE) and Mozilla Firefox. He
       found that for "a total 284 days in 2006 (or more than nine months out of the
       year), exploit code for known, unpatched critical flaws in pre-IE7 versions
       of the browser was publicly available on the Internet. Likewise, there were
       at least 98 days last year in which no software fixes from Microsoft were
       available to fix IE flaws that criminals were actively using to steal
       personal and financial data from users... In a total of ten cases last year,
       instructions detailing how to leverage "critical" vulnerabilities in IE were
       published online before Microsoft had a patch to fix them. Microsoft labels
       software vulnerabilities `critical' -- its most severe rating -- if the flaws
       could be exploited to criminal advantage without any action on the part of
       the user, or by merely convincing an IE user to click on a link, visit a
       malicious Web site, or open a specially crafted e-mail or e-mail attachment.
       In contrast, Internet Explorer's closest competitor in terms of market share
       -- Mozilla's Firefox browser -- experienced a single period lasting just nine
       days last year in which exploit code for a serious security hole was posted
       online before Mozilla shipped a patch to remedy the problem..". He also notes
       that in several cases the attacks (from organized crime) were so severe, and
       Microsoft was so late in producing patches, that third-party security patches
       were released with many recommending their use.
   26. Internet Explorer (IE) users are far more likely to end up with a
       spyware-infected PC than Mozilla's Firefox users. If the user always says
       "yes" to security queries, unpatched IE was infected by 1.6% of domains while
        unpatched Firefox experienced 0.09%. If the user always says "no", IE was
       infected by 0.6% while Firefox experienced 0% (no infections). In TechWeb.com
       (February 9, 2005), [407]Gregg Keizer's article "Spyware Barely Touches
       Firefox" describes some research work from the University of Washington.
       Henry Levy stated that his research showed that users "will have a safer
       experience [surfing] with Firefox". Researchers Henry Levy and Steven Gribble
       crawled 45,000 websites, cataloguing their executable files, and then exposed
       unpatched Internet Explorer (IE) and Firefox browsers to them. They also
        observed whether running the program required a user to actively agree
        (something naive users unfortunately often do) or whether the program could
        install and run without being permitted to do so. During their most recent
        crawl, in October 2005, 1.6% of the domains infected the first IE
        configuration that always
       permitted executing programs; and 0.6% planted spyware even when the user
       rejected the program. In contrast, only 0.09% of domains infected Firefox
       when the user permitted it, and 0% (no) domain managed to infect Firefox
       without permission. A startling result of the research was the number of
       spyware sites; about 5% of all executable files on web sites are spyware, and
       "1 in 25 domains contain at least one piece of spyware waiting for victims".
       Levy said: "If you browse, you're eventually going to get hit with a spyware
       attack". Perhaps choosing the program with the better record would help.
       Obviously, you should patch your browser when there's a security patch. But
       next, we'll see statistics that make you worry about that too.
   27. Proprietary vendor Microsoft took three times as long (on average) to fix
        critical flaws in its Windows software as FLOSS Mozilla took to fix
       critical flaws in its software, according to analysis by Brian Krebs.
       Microsoft took 134 days on average to release patches for security problems
       in 2004-2005; Mozilla averaged 37 days. [408]Brian Krebs' "A Time to Patch
       II: Mozilla" compared patch times of Mozilla with Microsoft. Even with an
       outlier included, Mozilla did much better on average than Microsoft. Mozilla
       took an average of about 37 days to issue patches for critical security
        problems in its products over a 3-year period. It often did much better than
        that average; one-third of its critical security updates were issued less than
        10 days after it was notified. (The longest time was for a bug that perhaps
        should not have been marked as "critical"; Microsoft had exactly the same bug
        but marked it only as "moderate".)
       In a [409]similar study of Microsoft's vulnerability report response times,
       he notes that "In 2003, Microsoft took an average of three months to issue
       patches for problems reported to them. In 2004, that time frame shot up to
       134.5 days, a number that remained virtually unchanged in 2005". This is an
       extraordinarily long time; such a lengthy time may convince vulnerability
       reporters that Microsoft doesn't take vulnerability reports seriously. It's
        certainly true that many more people report vulnerabilities quietly to
        Mozilla than to Microsoft; with Microsoft, people often report vulnerabilities
        publicly instead (the "full disclosure" method). Many advocates of full
        disclosure say that they do it because companies often ignore vulnerability
        reports until they're made public, so they go public to start with. The data
        certainly shows that Microsoft does fix problems released under full
        disclosure more quickly. In 2003, it took an average of 71 days to release a
        fix for flaws reported under "full disclosure"; in 2004 it decreased to 55
       days, and in 2005 it shrank further to 46 days. Note that this 46 day value
       is still longer than the average Mozilla repair time for reports that were
       usually private.
       It may be that security researchers trust that Mozilla will usually respond
       quickly to private vulnerability reports -- with good reason, given their
       typical response times. And in contrast, they may not trust Microsoft to
       respond quickly to private vulnerability reports -- and unfortunately, the
       data suggests that they have reason to believe that.
   28. FLOSS suppliers are 60% faster than proprietary suppliers at responding to
        vulnerability reports. The analysis paper [410]Empirical Analysis of Software
        Vendors' Patching Behavior: Impact of Vulnerability Disclosure examined the
        behavior of 325 vendors and 438 unique vulnerabilities. Their primary
        interest was in whether or not publicly announcing a vulnerability sped
       up its repair (it does). However, they also compared FLOSS suppliers to
       proprietary suppliers, and found that the FLOSS suppliers were 60% faster
       than the proprietary ones.
    29. According to a Network Computing evaluation, an FLOSS vulnerability scanner
        (Nessus) was found to be the best (most effective). On January 8, 2001,
        Network Computing's article [411]Vulnerability Assessment Scanners reported
       an evaluation of nine network scanning tools, most of them proprietary. In
       their evaluation, Network Computing set up demonstration systems with 17 of
       the most common and critical vulnerabilities; they then used the various
       network scanning tools to see how effectively each of the tools detected
       these vulnerabilities. Sadly, not one product detected all vulnerabilities;
       the best scanner was the FLOSS program Nessus Security Scanner, which found
        15 of the 17 (and also received their top total score); the next best was a
       proprietary scanner which only found 13.5 out of 17.
       In their words,

     Some of us were a bit skeptical of the open-source Nessus project's
     thoroughness until [Nessus] discovered the greatest number of vulnerabilities.
     That's a hard fact to argue with, and we are now eating our words ... [Nessus]
     got the highest overall score simply because it did more things right than the
     other products.
       I agree with the authors that ideally a network vulnerability scanner should
       find every well-known vulnerability, and that "even one hole is too many".
       Still, perfection is rare in the real world. More importantly, a
       vulnerability scanner should only be part of the process to secure an
       organization - it shouldn't be the sole activity. Still, this evaluation
       suggests that an organization will be more secure, not less secure, by using
       an FLOSS program. It could be argued that this simply shows that this FLOSS
       program had more functionality - not more security - but in this case, the
       product's sole functionality was to improve security.
   30. Information Systems Journal (a peer-reviewed journal) published researcher
       Christian Payne's results, showing good evidence that FLOSS can be secure.
       Information Systems Journal, Vol.12, Issue 1, February 2002, includes the
       peer-reviewed paper [412]"On the security of open source software" by
       Christian Payne of Murdoch University (Perth, Australia). In it, Payne first
       summarizes the various arguments made for and against open source software.
       He discusses some of the arguments that FLOSS is more secure, in particular,
        claims that the process of peer review improves security, that FLOSS flexibility
        and freedom are a significant aid (e.g., organizations are free to audit
       FLOSS, modify it to meet their security needs, and rapidly patch FLOSS
       without having to wait for a vendor), and that FLOSS projects tend to respond
       more quickly with security fixes. He also discusses some of the arguments
        made against FLOSS, such as claims that vulnerabilities are harder for
       attackers to find in proprietary programs (since the source code is not
       available), and that there are flaws in the peer review argument (e.g., it
       may be available but not necessarily reviewed). In short, there are different
       effects, and it's easy to have opinions about the strengths of those
       different effects. Without measurement, it's hard to know what effects are
       stronger.
       But Payne goes beyond a mere summary of arguments, and actually works to try
       to gather quantitative data to measure the effect of these alternative
       approaches. Payne devised a scoring system for measuring security features,
        measuring reported security vulnerabilities, and then rolling those two
        factors into a final score (a generic sketch of this kind of roll-up appears
        at the end of this item). He then applied this to two FLOSS systems (Debian
       and OpenBSD) and one proprietary system (Solaris, which at the time was
       proprietary); all are Unix-based operating systems. The following table
       summarizes the results:

                                              Debian Solaris OpenBSD
                      Number of Features:     15     11      18
                        Features score:       6.42   5.92    7.03
                   Number of Vulnerabilities: 12     21      5
                     Vulnerabilities score:   7.72   7.74    4.19
                          Final Score:        -1.0   -3.5    10.2

       OpenBSD had the most security features (features that support
       confidentiality, integrity, availability, or audit), with Debian second and
       Solaris third. OpenBSD also had the highest score for those features. In
       terms of vulnerabilities, OpenBSD had the fewest reported vulnerabilities,
       and those vulnerabilities "were also relatively minor[,] only rating an
       average of 4.19 out of 10". Solaris, the proprietary system, had the largest
       number of vulnerabilities. The final rolled-up score is quite intriguing: of
       the three systems, the proprietary system had the worst security by this
       rolled-up measure.
       The author correctly notes that these are only a few systems, using
       information taken at only one point in time, so these results are "far from
       being final". And the author certainly does not take the view that any FLOSS
       program is automatically more secure than any proprietary alternative. Still,
       this data suggests that FLOSS programs can be more secure than their
       competing proprietary products. Hiding the source code certainly did not
       reduce the number of reported vulnerabilities, contrary to some proprietary
       vendors' claims; the proprietary system had the most vulnerabilities reported
        about it. OpenBSD has a far better score than either of the other systems; the
       author believes this is because of OpenBSD's focused code audits by
       developers with the necessary background and security expertise.
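        Payne's exact formulas are given in his paper and are not reproduced here.
        Purely to illustrate the general idea of rolling a feature score and a
        vulnerability score into a single figure, here is a generic sketch; the
        weighting scheme and the numbers are assumptions for illustration, not
        Payne's actual method:

          # Generic illustration of rolling security ratings up into one score.
          # This is NOT Payne's actual formula; it only shows the general approach
          # of rewarding rated security features and penalizing rated
          # vulnerabilities.
          def feature_score(feature_ratings):
              """Average rating (0-10) of the security features a system offers."""
              return sum(feature_ratings) / len(feature_ratings)

          def vulnerability_score(vuln_severities):
              """Average severity (0-10) of the reported vulnerabilities."""
              return sum(vuln_severities) / len(vuln_severities)

          def final_score(feature_ratings, vuln_severities, penalty_weight=1.5):
              """Hypothetical roll-up: features help, vulnerabilities hurt."""
              return (feature_score(feature_ratings)
                      - penalty_weight * vulnerability_score(vuln_severities))

          # Hypothetical system with four rated features and three vulnerabilities:
          print(final_score([7, 8, 6, 9], [5, 3, 4]))   # 7.5 - 1.5*4.0 = 1.5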
        A BZ Research survey of 6,344 software development managers shows Linux
        rated superior to Windows against operating system security attacks, and
        FLOSS was in most categories considered equal or better at the application
        layer. [413]A
       BZ Research survey of 6,344 software development managers reported in April
       2005 asked about the security of different popular enterprise operating
       environments; FLOSS did very well. Below are some of the results; the margin
       of error for the survey is 2.5 percentage points.
       Among server operating systems, there was uniform agreement that both Sun
       Solaris and Linux were much more secure than Microsoft's Windows Server
        against operating system related attacks. When comparing Sun Solaris against
        Linux by this measure, there was no consensus as to which was
        better against operating system level attacks; more people ranked
        Linux as "secure or very secure" than Sun Solaris, yet more people
        also ranked Linux as "very insecure or insecure" than Sun Solaris. One
       complication (for this paper's purpose) is that Sun Solaris was originally
       built in large part from FLOSS approaches, then made proprietary for a time,
       and more recently released as FLOSS, so it's difficult to cleanly take
       lessons from its Solaris results for either FLOSS or proprietary approaches.

                                       MS Windows Server Linux Sun Solaris
            Very insecure or Insecure: 58%               6%    13%
              Secure or very secure:   38%               74%   66%

       Windows Server also did poorly against application-related "hacks and
       exploits":

                                             MS Windows Server Linux
                  Very insecure or Insecure: 58%               18%
                    Secure or very secure:   30%               66%

        FLOSS was also far ahead of proprietary programs in 4 of the 8 categories
       they considered: desktop/client operating systems (44% to 17%), Web servers
       (43% to 14%), server operating systems (38% to 22%), and components and
       libraries (34% to 18%). Results were essentially equal in three categories:
       desktop/client applications, server applications and application servers.
        Only in one area was proprietary software considered more secure than FLOSS:
        database servers (34% to 21%).
       Note that this is merely a survey of opinions. Opinions can, of course, be
       quite wrong; measurements of products are often better than measurements of
       opinions. Still, opinion polls of large numbers of people who would have
       every reason to know the facts should not be ignored.

   Security is notoriously hard to measure, and many reports that attempt to do so
   end up with interesting information that's hard to interpret or use. And some
   reports come from sources whose reliability is widely questioned. On November 2,
   2004, [414]mi2g reported on successful digital breaches against permanently
   connected computers worldwide. They concluded that BSDs (which are usually FLOSS)
   and Apple's computers had the fewest security breaches; on the surface, that
   sounds positive for FLOSS. They also reported that GNU/Linux systems had the most
   breaches, followed by Windows. That result sounds mixed, but digging deeper it
   turns out that this ranking is artificial, based on artificial definitions. Their
   default definition for a security breach only included manual attacks and ignored
   malware (viruses, worms, and Trojans). Yet malware is one of the dominant
   security problems for Windows users, and only Windows users! After all, why
   bother with a manual attack when completely automated attacks against broad
   collections of computers will do more? When they include malware in their
   calculations for all system breaches, "including the impact of MyDoom, NetSky,
   SoBig, Klez and Sasser, Windows has become the most breached computing
   environment in the world accounting for most of the productivity losses
   associated with malware - virus, worm and trojan - proliferation". Even without
   malware, in governments "the most breached Operating System for online systems
   has now become Windows (57.74%) followed by Linux (31.76%) and then BSD and Mac
   OS X together (1.74%)" (a reversal of their previous rankings). But while these
   results are interesting, there are significant problems in interpreting what
   these results actually mean:
    1. Ignoring malware in the main report is hard to justify, though to be fair the
       report does clearly state this assumption and explains how the results would
       change with a different definition. But most users want to be protected from
       all attacks, automated or not, and it's especially hard to justify this
       assumption since malware is a leading attack on only one of the systems.
    2. None of these statistics, at least what's publicly posted, seem to take
       market share into account, or control sampling in general. If 2 of 100 type A
       machines are broken into, and 1 of 1 type B machines are broken into, type A
       may have twice as many break-ins, but that's irrelevant to most users; what's
       more interesting is noticing that 98% of the type A machines were unbreached,
       while 0% of the type B machines were unbreached! Besides, what you really
       want to know is not raw numbers like this, but the probability that a given
        system will be breached (given various criteria such as security configuration
        and whether you're relatively up-to-date on patches). That information doesn't
       appear to be available from the public information provided.
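        To make that concrete, here is a tiny sketch, using the hypothetical 2-of-100
        versus 1-of-1 numbers above, of why raw breach counts mislead unless they are
        normalized by the number of deployed systems:

          # Hypothetical numbers (from the example above) showing why raw breach
          # counts must be normalized by the installed base.
          deployed = {"Type A": 100, "Type B": 1}   # machines of each type sampled
          breached = {"Type A": 2,   "Type B": 1}   # machines of each type breached

          for system in deployed:
              rate = breached[system] / deployed[system]
              print(f"{system}: {breached[system]} break-ins, "
                    f"{100 * rate:.0f}% of machines breached, "
                    f"{100 * (1 - rate):.0f}% unbreached")
          # Type A: 2 break-ins, 2% of machines breached, 98% unbreached
          # Type B: 1 break-ins, 100% of machines breached, 0% unbreached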

   Checking the source (mi2g) yields decidedly mixed reports, too. mi2g clearly
   states that it has no financial interest in Apple. I always search for financial
   links in research reports, and that's a good sign at least. However, [415]The
   Register, [416]the full disclosure mailing list, [417]attrition.org, [418]Vmyths,
   and [419]Yahoo! News provide a number of troubling reports about the quality and
   validity of mi2g's reports. Many of these reports suggest that these figures are
   made up, and cannot be relied on at all. Hopefully in the future I can gain a
   better understanding of the situation; I know nothing more than what I reference
   above. But for now, I'm mentioning both sides (mi2g's results and the concerns
    many people have raised about them), so that those who have heard about
   these results will know about the controversies and limitations surrounding this
   data. I'm not including mi2g results in my major list of studies, given the
   limitations and current questions surrounding them.

   One serious problem in making secure software is that there are strong economic
   disincentives for proprietary vendors to make their software secure. For example,
   if vendors make their software more secure, they would often fail to be "first"
   in a given market; this often means that they will lose that market. Since it is
   extremely difficult for customers to distinguish proprietary software with strong
    security from software with poor security, the poor products tend to eliminate the
   good ones (after all, they're cheaper to develop and thus cost less). Governments
   have other disincentives as well. For a discussion of some of the economic
   disincentives for secure software, see [420]Why Information Security is Hard - an
   Economic Perspective by Ross Anderson (Proceedings of the Annual Computer
   Security Applications Conference (ACSAC), December 2001, pp. 358-365). It's not
   clear that FLOSS always avoids these disincentives, but it appears in at least
   some cases it does. For example, FLOSS source code is public, so the difference
   in security is far more visible than in proprietary products.

   One of the most dangerous security problems with proprietary software is that if
   intentionally malicious code is snuck into it, such code is extremely difficult
   to find. Few proprietary vendors have other developers examine all code in great
   detail - their testing processes are designed to catch mistakes (not malice) and
   often don't look at the code at all. In contrast, malicious code can be found by
   anyone when the source code is publicly available, and with FLOSS, there are
   incentives for arbitrary people to review it (such as to add new features or
   perform a security review of a product they intend to use). Thus, someone
    inserting malicious code into an FLOSS project runs a far greater risk of
   detection. Here are two examples, one confirmed, one not confirmed:
    1. Some time between 1992 and 1994, Borland inserted an intentional "back door"
       into their database server, "InterBase", as a secret username and fixed
       password. This back door allowed any local or remote user to manipulate any
       database object and install arbitrary programs, and in some cases could lead
       to controlling the machine as "root". This vulnerability stayed in the
       product for at least 6 years - no one else could review the product, and
       Borland had no incentive to remove the vulnerability. Then Borland released
       its source code on July 2000 as an FLOSS project. The "Firebird" project
       began working with the source code, and uncovered this serious security
       problem with InterBase in December 2000 (only 5 months after release). By
       January 2001 the CERT announced the existence of this back door as CERT
       advisory CA-2001-01. What's discouraging is that the backdoor can be easily
        found simply by looking at an ASCII dump of the program (a common cracker
        trick; see the sketch just after this list), so it's quite possible that
        this vulnerability was exploited many
       times in the intervening years. Once this problem was found by open source
       developers reviewing the code, it was patched quickly.
    2. Mohammad Afroze Abdul Razzak, arrested by Mumbai (Bombay) police Oct. 2,
       2001, claims that [421]Osama bin Laden's Al Qaeda network were able to gain
       employment at Microsoft and attempted to plant "trojans, trapdoors, and bugs
       in Windows XP". This was reported to Ravi Visvesvaraya Prasad, a New Delhi
       information systems and telecommunication consultant, and then reported by
       the [422]Washington Post's Newsbytes division. This claim has not been
       confirmed; indeed, I'm somewhat skeptical. The problem, however, is that this
       is impossible to disprove. Even if this particular case isn't true, note that
       this threat is unfortunately a credible threat to proprietary software,
       because very few of its users can review the code. This is far less dangerous
       to FLOSS software, due to the worldwide review that's possible (including the
       ability to see the changes made in each version).
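
    The "ASCII dump" trick mentioned in the first example above is worth
    illustrating. A minimal, hedged sketch of it in Python (roughly what the Unix
    strings utility does, with a hypothetical file name) follows; a dump like this
    is often enough to reveal a hardcoded username or password:

      # Minimal sketch of the "ASCII dump" trick: print runs of printable
      # characters found in a binary, which is often enough to reveal hardcoded
      # credentials.  (The example file name below is hypothetical.)
      import re
      import sys

      def ascii_strings(path, min_len=6):
          with open(path, "rb") as f:
              data = f.read()
          # Runs of printable ASCII bytes at least min_len long:
          for match in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data):
              yield match.group().decode("ascii")

      if __name__ == "__main__":
          target = sys.argv[1] if len(sys.argv) > 1 else "server.bin"
          for s in ascii_strings(target):
              print(s)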

   [423]Bruce Perens, in "Open sourcers wear the white hats", makes the interesting
   claim that most of the people reviewing proprietary products looking for security
   flaws (aside from one or two paid reviewers) are "black hats," outsiders who
   disassemble the code or try various types of invalid input in search of a flaw
   that they can exploit (and not report). There is simply little incentive, and
   many roadblocks, for someone to search for security flaws simply to improve
   someone else's proprietary product. "Only a black hat would disassemble code to
   look for security flaws. You won't get any `white hats' doing this for the
   purpose of [just] closing the flaws". In contrast, he thinks many open source
   developers do have such an incentive. This article slightly overstates the case;
   there are other incentives (such as fame) that can motivate a few people to
   review some other company's proprietary product for security. Still, it has a
   point; even formal reviews often only look at designs (not code), proprietary
   code is often either unreviewed or poorly reviewed, and there are many cases
   (including the entire OpenBSD system) where legions of developers review open
   source code for security issues. As he notes, "open source has a lot of `white
   hats' looking at the source. They often do find security bugs while working on
   other aspects of the code, and the bugs are reported and closed".

   Those who are familiar with computer security issues may raise an objection: what
   about the "Trusting Trust" attack? An Air Force evaluation by Karger and Schell
   first publicly described this very nasty computer attack, which Ken Thompson ably
   demonstrated and described in his classic 1984 paper [424]"Reflections on
   Trusting Trust". Thompson showed that because we use software to create other
   software, if an attacker subverts the software-creating programs, no amount of
   auditing any program can help you - the subverted programs can hide whatever they
   want to! This has been called the "uncounterable attack", and some have said that
   it's impossible to secure computers simply because this attack is possible. Some
   have even said that all those security audits of FLOSS are worthless, because
   subverted tools could insert attacks the auditors couldn't see. But it turns out
   that the trusting trust attack can be countered. My 2005 paper [425]Countering
   Trusting Trust through Diverse Double-Compiling (DDC), published by ACSAC, shows
   how the "uncounterable" trusting trust attack can be countered. But there's a
   catch: the DDC defense only works if you can get the source code for your
   software creation tools, including the operating system, compiler, and so on.
   That kind of information is typically only available for FLOSS programs! Thus,
   even in the case of the dangerous "trusting trust" attack, FLOSS has a security
   advantage.
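
    To give a flavor of the DDC idea (and only a flavor), here is a highly
    simplified sketch: rebuild the compiler from its own source, starting from an
    independent trusted compiler, and check that the result is bit-for-bit
    identical to the binary under test. The commands, paths, and the assumption of
    a deterministic build are all illustrative assumptions, not the full procedure
    from the paper:

      # Highly simplified sketch of Diverse Double-Compiling (DDC).  Real DDC must
      # also handle build options, libraries, and nondeterminism; the paths and
      # command lines below are hypothetical.
      import hashlib
      import subprocess

      def sha256(path):
          with open(path, "rb") as f:
              return hashlib.sha256(f.read()).hexdigest()

      def compile_with(compiler, source, output):
          # Hypothetical invocation: "<compiler> <source> -o <output>"
          subprocess.run([compiler, source, "-o", output], check=True)

      def ddc_check(compiler_under_test, compiler_source, trusted_compiler):
          compile_with(trusted_compiler, compiler_source, "stage1")   # step 1
          compile_with("./stage1", compiler_source, "stage2")         # step 2
          # If the compiler under test was honestly built from this source,
          # stage2 should match it exactly (assuming a deterministic build).
          return sha256("stage2") == sha256(compiler_under_test)

      if __name__ == "__main__":
          ok = ddc_check("./cc-under-test", "cc.c", "/opt/trusted/cc")
          print("match: no evidence of trusting-trust subversion" if ok
                else "MISMATCH: investigate")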

   FLOSS programs can be evaluated using the formal security evaluations required by
   some government agencies, such as the Common Criteria (ISO Standard 15408) and
    NIST FIPS 140. One complication has been that many governments have assumed that
   vendors would pay for such evaluations on their own. This assumption is a poor
   match for many FLOSS projects, whose business models typically require that users
   who want a particular improvement (such as an evaluation) pay for that
   improvement (in money or effort). This doesn't make formal security evaluations
   of FLOSS projects impossible, but it may require that customers change their
   approach to performing evaluations in some cases. In particular, customers will
   need to not assume that vendors will do evaluations `for free.' Part of the
   problem is that many organizations' acquisition strategies were defined before
   FLOSS became prevalent, and have not yet been adjusted to the widespread presence
   of FLOSS. Some FLOSS programs have multiple project sites, so an organization
    must select exactly what project to evaluate, but that's not really a change;
   evaluations of proprietary programs must select a specific version too.

   Here are several reports on FLOSS program evaluations:
    1. [426]Government Computer News reports that Novell Inc.'s SUSE Linux
       Enterprise Server 9 has achieved Controlled Access Protection Profile (CAPP)
       with EAL 4+ when running on IBM Corp.'s eServers. [427]The U.S. NIAP
       Validated Products List shows that Novell's SuSE Linux Enterprise Server V8
       successfully passed a Common Criteria EAL3+ evaluation against the Controlled
       Access Protection Profile (CAPP) in January 2004.
    2. Red Hat Enterprise Linux 3 passed an EAL2 evaluation in February 2004.
       Various reports in [428]IT Security and by [429]Red Hat state that in August
        2004 Red Hat Enterprise Linux 3 was successfully evaluated against the Common
        Criteria
       EAL 3+ and the Controlled Access Protection Profile (though it hasn't
       appeared in the Validated Products List yet). [430]Red Hat also reports that
       they are working to complete an EAL 4 evaluation (in [431]various forums).
    3. [432]Mandrakesoft (now Mandriva) and others have won a 1 million Euro
       three-year contract to help create a highly secure Linux based solution for
       the French Ministry of Defense that meets Common Criteria Evaluation
       Assurance Level (EAL) 5.
    4. [433]Trusted Computer Solutions Inc. of Herndon, Va., expects to begin
       beta-testing Trusted Linux this fall and seek Common Criteria certification
       at EAL 4 to meet not only the Controlled Access Protection Profile (CAPP),
       but the additional requirements of the Labeled Security Protection Profile,
       the Role-based Access Control Protection Profile, and the requirements of
       Director of Central Intelligence Directive 6/3.
    5. The [434]IBM Crypto for C (ICC) library received a FIPS 140-2 level 1
       certificate #384 in 2004, and it uses the cryptographic library provided by
       FLOSS OpenSSL.
    6. The FLOSS cryptographic library OpenSSL is being evaluated itself using the
       FIPS 140 evaluation process. The [435]OpenSSL FAQ provides more information
       on an effort to evaluate OpenSSL sponsored by HP and the Defense Medical
       Logistics Standard Support program.

   Some other interesting data about security can be found in [436]Google
   Facts/Statistics question about computer security and loss of data.

   The "Alexis de Tocqueville Institute" (ADTI) published a white paper called
   [437]"Opening the Open Source Debate" that purported to examine FLOSS issues.
   Unfortunately, ADTI makes many wrong, specious, and poorly-argued claims about
   FLOSS, including some related to security. Wired (in its article [438]Did MS Pay
   for Open-Source Scare?) made some startling discoveries about ADTI; after
   querying, they found that "a Microsoft spokesman confirmed that Microsoft
   provides funding to the Alexis de Tocqueville Institution... Microsoft did not
   respond to requests for comment on whether the company directly sponsored the
   debate paper. De Tocqueville Institute president Ken Brown and chairman Gregory
   Fossedal refused to comment on whether Microsoft sponsored the report".
   [439]Politech found additional suspicious information about ADTI, and [440]UPI
   reported that ADTI receives a significant portion of its funding from the
   Microsoft Corp, and thus it essentially lobbies in favor of issues important to
    Microsoft. ADTI has a history of creating "independent" results that
    are apparently paid for by corporations (e.g., see the [441]Smoke Free for Health
   article about ADTI's pro-tobacco-lobby papers). Reputable authors clearly
   identify any potential conflict of interest, even if it's incidental; ADTI did
   not. Specific to their report, [442]Andy Tanenbaum has described how Ken Brown of
   ADTI failed to understand the issues and appeared to have an agenda.

   The ADTI paper makes many errors and draws unwarranted conclusions. I'll just
   note a few examples of the paper's problems that aren't as widely noted
   elsewhere: incorrect or incomplete quotations, rewriting web browser history, and
   cleverly omitting the most important data in one of their charts:
     * The ADTI "quotes" me several times in the paper, but in some cases claims I
        said something I never said, and in other cases places my words out of
        context by
       intentionally omitting important things that I said. ADTI originally claimed
       that I said that "without licensing the source code in a multi-license
       format, (referring to other more permissive licenses), it is impossible for
       GPL to work for a proprietary business model". But I never said this. In
       fact, I specifically noted to ADTI that [443]Microsoft sells a GPL'ed product
       (a fact I'd already publicly published). Instead of removing the statement,
       ADTI later made up a statement and claimed that I said it. What I really said
       was more nuanced: "without licensing the source code in a multi-license
       format [GPL and other licenses], the GPL does not permit certain kinds of
       uses in proprietary business models". The words are similar, but this is a
        much narrower statement. In particular, ADTI's Brown was essentially trying
        to claim that the GPL was incompatible with business, even though
       this wasn't true, I told them it wasn't true, and even provided them with
       examples. ADTI also claims I said that "today I would be confident that the
       number [of GPL software] has probably grown to 80%;" I only said that I
       believed the number was probably larger than 50%, but since I couldn't
       remember the exact figures offhand, I told them to examine my papers - a
       trivial search which ADTI did not do (if they had, they'd notice that I'd
       recently published that [444]71.85% of Freshmeat's software packages were
       covered by the GPL). More intriguing are the omissions. For example, I
       explained to ADTI the GPL license (which they did not understand, even though
       they were attacking it); ADTI seems to think that the GPL requires public
       release of code, but it does not. The GPL only requires that those who
       receive the binary executable receive the source code. This is crucial,
       because it means you can still keep "secrets" in GPL'ed code, in spite of
       ADTI's implied assertion otherwise. Besides, there's anecdotal evidence that
       the government uses most GPL'ed code as-is, in which case these issues don't
       apply - the GPL permits arbitrary use and redistribution of unmodified
       copies.
     * For a second example, the ADTI paper rewrites the history of web browsers in
        an attempt to make its claims; it bases much of its argument on the claim
        that Mosaic was an
       open source web browser, but [445]it never was; modified versions of the Unix
       version could only be used non-commercially without a separate license (FLOSS
       must be usable commercially), and the Mac and Windows licenses were even more
       restrictive. It also completely omits the heavily publicized move of Netscape
       to FLOSS in 1998, clearly the most important event in web browser history
       relating to FLOSS. I specifically mentioned these problems to ADTI before
       they published their paper, but ADTI was not willing to fix their paper to
       meet the facts.
      * Switching to the third example, ADTI includes a chart showing source lines
       of code (SLOC) for various programs; it even references my paper [446]More
       than a Gigabuck while noting that the Linux kernel is over 2 million SLOC.
       The same chart also reports that Windows XP is 30 million SLOC, an
       interesting statement since to my knowledge this value has not been made
       public (ADTI has not revealed their source, but has confirmed to me that they
       really meant Windows XP). But note the invalid comparison - ADTI reports on
       the Linux kernel (a small part of an OS), and Windows XP (a whole OS), but
        not on a whole FLOSS OS. ADTI willfully ignores my paper's abstract and main
       point, which reported that the whole Red Hat Linux 7.1 distribution is also
       30 million SLOC; by omitting the most important data, ADTI gives false
       impressions. But these are merely the tip of the iceberg; the paper's flaws
       are so numerous, and discussing the flaws in its conclusions require so much
       effort, that a serious rebuttal would require writing a whole separate paper.

   Thus, I recommend that anyone who reads the ADTI paper also examine the detailed
   rebuttals available from many different sources, since these rebuttals expose the
   paper's numerous flaws. Rebuttals are available from [447]John Viega and Bob
   Fleck of Secure Software (Viega is a respected security expert), [448]Juliao
   Duartenn (Director of the Security Skill Center, Oblog Software, SA),
   [449]Roaring Penguin's David Skoll (via the Register), [450]Ken Ambrose (via
   LWN), and [451]Leon Brooks. [452]Anthony Awtrey analyzed the changes made in the
   published editions of the ADTI paper. Operating system expert [453]Andrew
   Tanenbaum responded to ADTI's later claim that Torvalds stole Linux, and found
   that ADTI's Ken Brown "doesn't have a clue what he is talking about," was
   "confused about patents, copyrights, and trademarks," failed to even do basic
   research (he failed to consider original sources and didn't bother to read the
   major works on his subjects), and wrote "patent nonsense". In short, ADTI's paper
   is a highly biased and poorly researched "report".

   All of this is unfortunate, because the real Alexis de Tocqueville strongly
    approved of FLOSS's underlying approaches. [454]Alexis de Tocqueville
   remarked on the extraordinary success in the United States of voluntary community
   associations to do many tasks, and viewed them extremely favorably. He found such
   associations to be remarkably effective.

   There are other non-quantitative discussions on FLOSS and security. [455]The
   October 2002 paper Open Source Digital Forensics Tools: The Legal Argument by
   Brian Carrier notes that to enter scientific evidence into a United States court,
   a forensics tool must be reliable and relevant as determined through the
   "Daubert" guidelines. The paper examines then those guidelines and argues that
   "open source tools may more clearly and comprehensively meet the [forensics]
   guidelines than closed source tools". Stacey Quandt's [456]"Linux and Windows
   security compared" compares Windows and GNU/Linux security qualitatively; she
   concludes that they're comparable in network security/protocols, deployment and
   operations, and trusted computing; Linux is superior in base security,
   application security, and open standards. The only area where Windows was ahead
   was in assurance, because an EAL4 Common Criteria evaluation has been completed
   for Windows; an EAL3 evaluation for a GNU/Linux has completed, but an EAL4
   evaluation for a GNU/Linux is in process but not yet complete. Since an EAL4
   GNU/Linux evaluation is expected to complete by around the end of 2004, this
   doesn't appear to be a long-lasting advantage for Windows.

   Many security experts have stated that FLOSS has advantages over the security of
   proprietary software, including [457]Whitfield Diffie (co-inventor of public key
   cryptography), Bruce Schneier (expert on cryptography and computer security),
   Vincent Rijmen (a developer of the Advanced Encryption Standard (AES)),
   [458]Elias Levy (Aleph1, the former moderator of the popular security discussion
   group Bugtraq), [459]John Viega (author of a book on secure programming),
   [460]Kenneth van Wyk, and Peter Neumann (long-time expert on security). A
   humorous article expressing this view is the article [461]Microsoft Windows: A
   lower Total Cost of 0wnership (0wnership starts with zero, not the letter O; 0wn
   is slang for gaining illicit remote administrative control over someone else's
   computer). This article by Immunix, Inc., compares the security of Microsoft
   Windows and OSS systems based on their technology characteristics, and declares
   that the "best platform for your targets [victims] to be running is Microsoft
   Windows, allowing you unparalleled value for their dollar" (see the next section
   for the more traditional [462]Total Cost of Ownership information). This doesn't
   guarantee that a particular FLOSS program is more secure than a particular
   proprietary product - merely that there are some fundamental security advantages
   to easing public review.

    And it's worth noting that the better distributions, whose job includes ensuring
   that their packages don't have known vulnerabilities, seem to take their job
   seriously. Mark J. Cox has posted a summary of [463]how Red Hat ensured that
   Fedora Core 4 didn't include any known vulnerabilities (through an auditing
   process); [464]Debian does similar types of analysis.

   In contrast, [465]Microsoft's Jim Allchin disclosed under oath in court testimony
   that some Microsoft code was so flawed it could not be safely disclosed to the
   public. Yet more recently, Microsoft announced its "Government Security Program"
   to allow governments to view most source code (though not all code, and they
   cannot change and freely redistribute the results). Indeed, Reuters reported a
   survey by Forrester Research Inc. that found that [466]most computer security
   experts at major companies do not think Microsoft Corporation's products are
   secure; 77% said security was a top concern when using Windows. The primary
   problem reported was that patches were not implemented, because "administrators
   lacked both the confidence that a patch won't bring down a production system and
   the tools and time to validate Microsoft's avalanche of patches". If you need to
   secure Windows, feel free to look at my essay on [467]how to secure Microsoft
   Windows (for home and small business users); while many issues are true for any
   system, there are also a number of security problems that are essentially unique
   to Windows.

   Specialized applications may need high assurance software. If you are interested
   in that, see my essay [468]High Assurance (for Security or Safety) and Free-Libre
   / Open Source Software (FLOSS).

   Now it should be obvious from these figures that FLOSS systems are not magically
   invulnerable to security flaws. Indeed, some have argued that making the source
   code available gives attackers an advantage (because they have more information
   to make an attack). While FLOSS gives attackers more information, this ignores
   opposing forces: having the source code also gives the defenders more information
   (because they can also examine its original source code), and in addition, the
   defenders can improve the code. More importantly, the necessary information for
   breaking into a program is in the binary executable of the program; disassemblers
   and decompilers can quickly extract whatever information is needed from
   executables to break into a program, so hiding the source code isn't all that
   helpful against attackers who are willing to use such tools. Even if source code
   were required (it's not), attackers can often acquire it, either by simply asking
   for it (in exchange for funds) or by [469]acquiring the source code itself through
   an attack. Again, it is not true that
   proprietary programs are always more secure, or that FLOSS is always more secure,
   because there are many factors at work. Writing secure software does require that
   developers know how to do it, but there's no evidence that proprietary software
   developers in general have more such knowledge; indeed, since many developers
   create both proprietary and FLOSS programs, it's unlikely there's a major
   difference. Security is also greatly enhanced by review, and FLOSS encourages code
   review in a way that few proprietary projects match; certainly not all FLOSS
   programs are reviewed for security, but many are, both by other developers and by
   others (for example, [470]one group of students was assigned the task of finding
   and reporting vulnerabilities, and reported 44). And clearly, any vulnerabilities
   found must be fixed and the fixes distributed. Note that a well-configured and well-maintained
   system, of any kind, will almost always be far more secure than a poorly
   configured and unmaintained system of any kind over the long term. For a longer
   description of these issues, see [471]my discussion on open source and security
   (part of my book on [472]writing secure software). However, from these figures,
   it appears that FLOSS systems are in many cases better than - not merely equal
   to - proprietary software in their resistance to attack.

                             7. Total Cost of Ownership (TCO)

   Total cost of ownership (TCO) is an important measure; it doesn't matter if a
   product starts out cheaply if it costs you more down the line. However, TCO is
   extremely sensitive to the set of assumptions you make.

   Indeed, whatever product you use or support, you can probably find a study to
   show it has the lowest TCO for some circumstance. Not surprisingly, both
   Microsoft and [473]Sun provide studies showing that their products have the
   lowest TCO. [474]Xephon has a study determining that mainframes are the cheapest
   per-user (due to centralized control) at £3450 per user per year; Centralized
   Unix cost £7350 per user per year, and a decentralized PC environment costs
   £10850 per user per year. [475]Xephon appears to be a mainframe-based
   consultancy, though, and would want the results to come out this way. There are
   indeed situations where applying a mainframe makes sense, but as we'll see in a
   moment, you can use FLOSS in such environments too.

   In short, what has a smaller TCO depends on your needs and your environment.
   First, identify what the requirements are, including the types of applications.
   You must then determine the architectural options that meet these requirements.
   For example, GNU/Linux systems can be implemented as independent client systems
   with a few common servers, just like most Windows systems are. But there are many
   architectural alternatives, such as using X-Windows terminals (programs run on a
   central server, so the client systems can be extremely low-end "throw-away"
   systems), clustering (where tasks can be divided among many computers), or using
   [476]Stateless Linux (programs run locally on the computer, but since nothing is
   stored locally, anyone can log into any computer later).

   Then, to determine TCO you must identify all the important cost drivers (the
   "cost model") and estimate their costs. Don't forget "hidden" costs, such as
   administration costs, upgrade costs, technical support, end-user operation costs,
   and so on. [477]Computer Sciences Corporation's study "Open Source: Open for
   Business" (pp. 39-43) identifies the TCO factors that it believes are most
   important when comparing FLOSS with proprietary software: hardware costs
   (including purchase price and hardware maintenance), direct software costs
   (including purchase price and support and maintenance), indirect software costs
   (especially administration of licenses), staffing costs, support costs, and
   downtime (CSC claims that the "modularity of Linux can allow a very lean build to
   be deployed, which in turn can enable more stability").
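
   To make the idea of a cost model concrete, here is a minimal sketch (in Python)
   of the kind of spreadsheet-style calculation that CSC's factors suggest. Every
   number in it is a hypothetical placeholder rather than data from any study; the
   point is simply that TCO is the sum of many estimated cost drivers over a planning
   period, and that changing any one assumption changes the result.

      # Minimal TCO sketch: sum estimated cost drivers over a planning period.
      # Every figure below is a hypothetical placeholder, not data from any study.

      def total_cost_of_ownership(drivers, years=3):
          """One-time purchases plus recurring costs over the planning period."""
          one_time = drivers["hardware_purchase"] + drivers["software_purchase"]
          recurring = (drivers["hardware_maintenance_per_year"]
                       + drivers["software_support_per_year"]
                       + drivers["license_administration_per_year"]  # indirect cost
                       + drivers["staffing_per_year"]
                       + drivers["support_per_year"]
                       + drivers["downtime_cost_per_year"])
          return one_time + recurring * years

      proprietary = {"hardware_purchase": 20000, "software_purchase": 15000,
                     "hardware_maintenance_per_year": 2000, "software_support_per_year": 5000,
                     "license_administration_per_year": 3000, "staffing_per_year": 40000,
                     "support_per_year": 4000, "downtime_cost_per_year": 6000}
      floss = {"hardware_purchase": 18000, "software_purchase": 200,
               "hardware_maintenance_per_year": 2000, "software_support_per_year": 6000,
               "license_administration_per_year": 0, "staffing_per_year": 38000,
               "support_per_year": 4000, "downtime_cost_per_year": 4000}

      for name, drivers in (("proprietary", proprietary), ("FLOSS", floss)):
          print(f"{name}: 3-year TCO = ${total_cost_of_ownership(drivers):,}")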

   To be honest, the term "TCO" is common but misleading for most software,
   especially for proprietary software, because software users often don't own the
   software they use and thus don't have the rights of ownership. It might be more
   accurate to say that proprietary software users often "lease" or "rent" the
   software, and thus this category could more accurately be named "total cost to
   lease or own". Fundamentally, unless you arrange to have a software program's
   copyright transferred to you, you do not actually own the software -- you only
   own a license to run the software in certain limited ways. That's an important
   distinction; in particular, with proprietary software you typically do not have
   the rights associated with ownership. When you pay to own a physical product (say
   a building or computer hardware), you typically have nearly unlimited rights to
   modify and resell the product you bought (subject to legal limits that prevent
   harm to others, such as zoning laws and limits on electromagnetic emissions). In
   contrast, with nearly all proprietary software, you do not have the right to
   modify the software to suit your needs. Many proprietary licenses are even more
   stringent; they typically forbid reverse engineering the product to understand
   what it does (say, to examine its security), forbid publishing benchmarks or
   reviews without approval by the vendor, and often forbid (sub)leasing, reselling,
   or redistributing the product. These kinds of limits make proprietary software
   users more like the lessee or renter of a building, who can occupy a space but
   cannot modify or sublease the space. Some proprietary software programs are sold
   for use only over a period of time, and thus the analogy to renting is especially
   easy to see. And though many proprietary software programs are sold with a
   one-time cost (a "perpetual" license), in reality these programs also impose
   recurring fees: upgrade costs to keep using the programs on newer hardware and
   operating systems, upgrades so that your software remains compatible with others'
   copies and with other software, and various support fees. So even so-called
   perpetual licenses have recurring costs, like a typical rent or lease. This isn't
   necessarily terrible, and I'm
   certainly not going to say that such arrangements are unethical; people decide to
   rent or lease physical property too! But it's important to understand what the
   transaction entails. For more on this topic, see [478]Dr. Debora Halbert's The
   Open Source Alternative: Shrink-Wrap, Open Source and Copyright, particularly
   point 22. As explained by [479]Ross Anderson's Trusted Computing (TC) Frequently
   Asked Questions (FAQ), vendors are already working to build mechanisms to enforce
   this even more strongly, because so-called "trusted computing" transfers control
   of your computer from you to the vendors (the FSF calls this technology
   "treacherous computing" because while the computer is more trustworthy for users,
   it does this by becoming less trustworthy by owners). As Anderson says, "TC will
   also make it easier for people to rent software rather than buy it; and if you
   stop paying the rent, then not only does the software stop working but so may the
   files it created. So if you stop paying for upgrades to Media Player, you may
   lose access to all the songs you bought using it". Users of FLOSS software aren't
   actually owners either, and they have some of the same types of recurring costs
   (such as support). On the other hand, the rights FLOSS users are granted (users
   can understand, publicly comment on, modify, and redistribute the software -- and
   all this in perpetuity) are far closer to an owner's rights than the rights
   granted to a proprietary software user.
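
   To illustrate the point about recurring costs, here is a small sketch (in Python,
   with entirely made-up prices used only for illustration) that accumulates the cost
   of a so-called perpetual license over time once periodic upgrade and support fees
   are included; the shape of the result is what matters, not the particular numbers.

      # Illustrative only: accumulate the cost of a "perpetual" license over time
      # once periodic upgrade fees and annual support are included.
      # All prices below are hypothetical placeholders, not data from any study.

      PERPETUAL_LICENSE = 1000   # one-time purchase price
      UPGRADE_FEE = 500          # paid every UPGRADE_CYCLE years to stay compatible
      UPGRADE_CYCLE = 3
      ANNUAL_SUPPORT = 200       # support/maintenance contract per year

      def cumulative_cost(years):
          """Total spent on the 'perpetual' license after the given number of years."""
          upgrades = (years // UPGRADE_CYCLE) * UPGRADE_FEE
          return PERPETUAL_LICENSE + upgrades + ANNUAL_SUPPORT * years

      for y in (1, 3, 5, 10):
          print(f"After {y:2d} years: ${cumulative_cost(y):,}")
      # Even a one-time "perpetual" purchase keeps accruing cost, much like rent.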

   There's another problem in thinking that people really "own" software:
   maintenance matters. If a proprietary software company goes out of business, the
   value of the software it sold immediately plummets to near zero. This is not how
   people react when they purchase land or other real property; the value of the
   property does not diminish just because the seller is going out of business. This
   suggests that when users are purchasing software, they're really purchasing
   future support and upgrades as well. [480]Robert Lefkowitz's "Calculating the
   True Price of Software" argues that FLOSS pricing essentially splits the costs of
   initial value and the value of options on future maintenance, converting warrants
   on future maintenance and enhancements into options, so that instead of having a
   sole supplier (warrants), it creates a third-party market (options) of these
   derivatives.

   FLOSS has many strong cost advantages in various categories that, in many cases,
   will result in its having the smallest TCO:
    1. FLOSS costs less to acquire initially. FLOSS isn't free in the monetary
        sense, because the "free" in
       "free software" refers to freedom, not price. This distinction is usually
       summarized as "free speech, not free beer". [481]Merrill Lynch executive
       Robert Lefkowitz found what may be a better way to describe it: "We like to
       think of it as `free as in market.'"
       FLOSS isn't cost-free, because you'll still spend money for paper
       documentation, support, training, system administration, and so on, just as
       you do with proprietary systems. In many cases, the actual programs in FLOSS
       distributions can be acquired freely by downloading them ([482]linux.org
       provides some pointers on how to get distributions). However, most people
       (especially beginners and those without high-speed Internet connections) will
       want to pay a small fee to a distributor for a nicely integrated package with
       CD-ROMs, paper documentation, and support. Even so, FLOSS costs far less to
       acquire.
       For example, examine the price differences when trying to configure a server,
        such as a public web server or an intranet file and email server, in which
       you'd like to use C++ and an RDBMS. This is simply an example; different
       missions would involve different components. Using the prices from "Global
       Computing Supplies" (Suwanee, GA), September 2000, rounded to the nearest
       dollar, here is a quick summary of the purchasing costs:

                         Microsoft Windows 2000   Red Hat Linux
      Operating System   $1510 (25 client)        $29 standard, $76 deluxe, $156 professional (all unlimited)
      Email Server       $1300 (10 client)        included (unlimited)
      RDBMS Server       $2100 (10 CALs)          included (unlimited)
      C++ Development    $500                     included
       Basically, Microsoft Windows 2000 (25 client) costs $1510; their email server
       Microsoft Exchange (10-client access) costs $1300, their RDBMS server SQL
       Server 2000 costs $2100 (with 10 CALs), and their C++ development suite
       Visual C++ 6.0 costs $500. Red Hat Linux 6.2 (a widely-used GNU/Linux
       distribution) costs $29 for standard (90 days email-based installation
       support), $76 for deluxe (above plus 30 days telephone installation support),
       or $156 for professional (above plus SSL support for encrypting web traffic);
       in all cases it includes all of these functionalities (web server, email
       server, database server, C++, and much more). A public web server with
       Windows 2000 and an RDBMS might cost $3610 ($1510+$2100) vs. Red Hat Linux's
       $156, while an intranet server with Windows 2000 and an email server might
       cost $2810 ($1510+$1300) vs. Red Hat Linux's $76.
       Both packages have functionality the other doesn't have. The GNU/Linux system
       always comes with an unlimited number of licenses; the number of clients
        you'll actually use depends on your requirements. However, this certainly
        shows that, no matter what, Microsoft's server products cost thousands of
        dollars more per server than the equivalent GNU/Linux system (a short sketch
        of this arithmetic appears just after this list).
        For another in-depth analysis comparing the initial costs of GNU/Linux with
       Windows, see [483]Linux vs. Windows: The Bottom Line by [484]Cybersource Pty
       Ltd. Here's a summary of their analysis (in 2001 U.S. dollars):

                              Microsoft Solution   FLOSS (GNU/Linux) Solution   Savings by using GNU/Linux
      Company A (50 users)    $69,987              $80                          $69,907
      Company B (100 users)   $136,734             $80                          $136,654
      Company C (250 users)   $282,974             $80                          $282,894

       [485]Consulting Times found that as the number of mailboxes got large, the
       three-year TCO for mainframes with GNU/Linux became in many cases quite
       compelling. For 50,000 mailboxes, an Exchange/Intel solution cost $5.4
       million, while the Linux/IBM(G6) solution cost $3.3 million. For 5,000
       mailboxes, Exchange/Intel cost $1.6 million, while Groupware on IFL cost
       $362,890. For yet another study, see the [486]Cost Comparison from jimmo.com.
       Obviously, the price difference depends on exactly what functions you need
       for a given task, but for many common situations, GNU/Linux costs far less to
       acquire.
    2. Upgrade/maintenance costs are typically far less. Long-term upgrade costs are
       far less for FLOSS systems. For example, upgrading a Microsoft system will
       typically cost around half the original purchase. What's worse, you are
       essentially at their mercy for long-term pricing, because there is only one
       supplier (see [487]Microsoft Turns the Screws). In contrast, the GNU/Linux
       systems can be downloaded (free), or simply re-purchased (generally for less
        than $100), and the single upgrade can be used on every system. This doesn't
        include technical support, but the technical support can be competed, i.e.,
        purchased from any of several suppliers (a situation that's not practical for
        proprietary software). An anti-trust
       lawyer would say that FLOSS technical support is "contestable". In short, if
       you don't like your GNU/Linux supplier (e.g., they've become too costly), you
       can switch.
    3. FLOSS does not impose license management costs, does not in practice include
       noxious licensing clauses, and avoids nearly all licensing litigation risks.
       Proprietary vendors make money from the sale of licenses, and are imposing
       increasingly complex mechanisms on consumers to manage these licenses.
        Customers who cannot later prove that they paid for every installed copy of
       proprietary software (e.g., due to copying by an employee or losing the
       license paperwork) risk stiff penalties. In short: by using proprietary
       software, you run the risk of having the vendor sue you.
       To counter these risks, organizations must keep careful track of license
       purchases. This means that organizations must impose strict software license
       tracking processes, purchase costly tracking programs, and pay for people to
       keep track of these licenses and perform occasional audits.
       Organizations must also be careful to obey licensing terms, some of which may
       be extremely noxious or risky to the user. Those who think that proprietary
       software gives them "someone to sue" are in for a rude awakening --
       practically all software licenses specifically forbid it. [488]A Groklaw
       article contrasted the terms of the GPL vs. the Windows XP End-User License
       Agreement (EULA) terms, and stated that Windows XP's license was far more
       dangerous to users. For example, it requires a mandatory activation (where
       you reveal yourself to the vendor), it allows the vendor to modify your
       computer's software at will, the vendor may collect personal data about you
       without warning or limitation, and the vendor can terminate the agreement
       without due process. [489]Con Zymaris has published a detailed comparison of
       the GPL and the Microsoft EULA. Both note, for example, that if things go
       awry, you can get no more than $5 from the Microsoft EULA. Indeed, [490]many
       common EULAs now include dangerous clauses.
       In contrast, there's no license management or litigation risk in simply using
        FLOSS software. Some FLOSS programs do have legal requirements if you modify
       the program or embed the program in other programs, but proprietary software
       usually forbids modifying the program and often also imposes licensing
       requirements for embedding a program (e.g., royalty payments). Thus, software
       developers must examine what components they're employing to understand their
       ramifications, but this would be true for both FLOSS and proprietary
       programs. See the [491]licensing litigation discussion later in this paper
       for more about licensing costs and risks.
    4. FLOSS can often use older hardware more efficiently than proprietary systems,
       yielding smaller hardware costs and sometimes eliminating the need for new
       hardware. FLOSS runs faster on faster hardware, of course, but many FLOSS
       programs can use older hardware more efficiently than proprietary systems,
       resulting in lower hardware costs - and in some cases requiring no new costs
       (because "discarded" systems can suddenly be used again). For example, the
       [492]minimum requirements for Microsoft Windows 2000 Server (according to
       Microsoft) are a Pentium-compatible CPU (133 MHz or higher), 128 MiB of RAM
       minimum (with 256MiB the "recommended minimum"), and a 2 GB hard drive with
       at least 1.0 GB free. According to Red Hat, Red Hat Linux 7.1 (a common
       distribution of GNU/Linux) requires at a minimum an i486 (Pentium-class
       recommended), 32MiB RAM (64MiB recommended), and 650MB hard disk space (1.2
       GB recommended).
       In Scientific American's August 2001 issue, the article [493]The
       Do-It-Yourself Supercomputer discusses how the researchers built a powerful
       computing platform with many discarded computers and GNU/Linux. The result
       was dubbed the "Stone Soupercomputer"; by May 2001 it contained 133 nodes,
       with a theoretical peak performance of 1.2 gigaflops.
     5. When used as an application-server-based system, the total costs for hardware
       drop by orders of magnitude. Many people make the mistake of deploying FLOSS
       workstations (such as GNU/Linux or the *BSDs) the same way they would deploy
       Windows systems. Although it's possible, this is an unnecessarily costly
       approach if they're installing a set of workstations for typical productivity
        applications (e.g., word processing, spreadsheets, etc. for an office). For
       many, a better approach is to provide each user with a very old
       GNU/Linux-based machine which is merely a graphics display (an "X terminal"),
       and then run the actual applications on an "application server" that is
       shared by all the users. See [494]How to create a Linux-based network of
       computers for peanuts for more information about this. With this application
       server approach, workstations can cost about $30 each (using "obsolete"
       machines), a server (shared by many users) can cost about $1000 each, and
       nearly all system administration is centralized (reducing administration
       costs). A nice side-effect of this approach is that users can use any
       workstation just by logging in. A more detailed discussion of this approach
       is given in [495]Paul Murphy's article, Total cost of ownership series
       revisited. [496]Linux Style: Windows PCs vs. X Terminals: A Cost Comparison
       describes how the Mark O. Hatfield Library at Willamette University has used
       networked X terminals in its public and staff computing environments since
       1995. The 15-year cost of 25 Linux systems in this environment is estimated
       to be $41,359 versus a 15-year cost of $100,000 to $155,000 for Windows PCs
       serving the same function. This is how the City of Largo, Florida, and many
       other organizations use GNU/Linux.
    6. FLOSS tends to require less ongoing administration; a survey of European
       governments found that administrators of FLOSS systems can handle 35% more
       PCs per IT administrator than administrators of proprietary systems.
       [497]FLOSSPOLS' "Results and policy paper from survey of government
       authorities" (Deliverable D3) did a survey in March 2005 of 955 European
       local governments. It found that "FLOSS users administer 35% more PCs per IT
       administrator than non-users -- FLOSS use appears to reduce administrator
       workload per PC, and IT departments with high workloads are more likely to
       want a future increase in FLOSS use". About half (49%) of local government
       authorities reported intentionally using FLOSS, but a huge additional portion
       (29%) were definitely using FLOSS (GNU/Linux, MySQL or Apache) and were
       unaware that these were FLOSS; I suspect that the true percentage of users
       was probably even higher. Once people started using it, they wanted more; 70%
       of FLOSS users wanted to increase its use. [498]Groklaw summarized this
       FLOSSPOLS survey.
    7. As the number of systems and hardware performance increases, this difference
       in initial and upgrade costs becomes even more substantial. As the number of
       servers increases, proprietary solutions become increasingly costly. First,
       many proprietary systems (including Microsoft) sell per-client licenses; this
        means that even if your hardware can support more clients, you must pay
       more to actually use the hardware you've purchased. Secondly, if you want to
       use more computers, you must pay for more licenses in proprietary systems. In
       contrast, for most GNU/Linux distributions, you can install as many copies as
       you like for no additional fee, and there's no performance limit built into
       the software. There may be a fee for additional support, but you can go to
       competing vendors for this support.
       According to [499]Network World Fusion News, Linux is increasingly being used
       in healthcare, finance, banking, and retail due to its cost advantages when
       large numbers of identical sites and servers are built. According to their
       calculations for a 2,000 site deployment, SCO UnixWare would cost $9 million,
       Windows would cost $8 million, and Red Hat Linux costs $180.
    8. There are many other factors; their effect varies on what you're trying to
       do. There are many other factors in TCO, but it's difficult to categorize
       their effects in general, and it's generally difficult to find justifiable
       numbers for these other effects. Windows advocates claim that system
       administrators are cheaper and easier to find than Unix/Linux administrators,
       while GNU/Linux and Unix advocates argue that fewer such administrators are
       needed (because administration is easier to automate and the systems are more
       reliable to start with). Various reports have mentioned this (a [500]Red Hat
       executive stated that one Wall Street bank has one administrator for 800
        machines), and [501]quantitative studies are beginning to back the claim that
       fewer administrators are needed. Some GNU/Linux advocates have told me that
       GNU/Linux lends itself to hosting multiple services on one server in cases
       where Windows installations must use multiple servers. License compliance
       administration can be costly for proprietary systems (e.g., time spent by
       staff to purchase CALS, keep track of licenses, and undergo audits) - a cost
       that simply isn't relevant to FLOSS.
     9. A European Commission-sponsored study reported savings in nearly all cases
        from using FLOSS. The [502]Study on the Economic impact of open source
        software on innovation and the competitiveness of the Information and
        Communication Technologies (ICT) sector in the EU (November 20, 2006) said
        "Our findings show that, in almost all the cases, a transition toward open
        source reports of savings on the long term costs of ownership of the
        software products... Costs to migrate to an open solution are relevant and an
        organization needs to consider an extra effort for this. However these costs
        are temporary and mainly are budgeted in less than one year... Our findings
        report no particular delays or lost of time in the daily work due to the use
        of OpenOffice.org.... OpenOffice.org has all the functionalities that public
        offices need to create documents, spreadsheets, and presentations...
        OpenOffice.org is free, extremely stable, and supports the ISO Open Document
        Standard". A [503]Groklaw article on this study summarizes the report. This
        study presents a lot of quantitative data on other FLOSS topics as well.
   10. Cybersource's 2002 study found TCO savings of 24% to 34% when using FLOSS
       instead of Microsoft's proprietary approach; their 2004 study found TCO
       savings from 19% to 36%. [504]Cybersource's 2004 update of their "Linux vs.
       Windows: Total Cost of Ownership Comparison" (as noted in the [505]press)
       found 19% to 36% savings using Linux, compared to Microsoft Windows,
       depending on various factors (see their paper). This is basically an update
       of [506]Cybersource's "Linux vs. Windows: Total Cost of Ownership Comparison"
       2002 study, which modeled an organization with 250 computer-using staff, an
        appropriate number of workstations and servers, Internet connectivity, an
       e-business system, network cabling and hardware, standard software, and
       salaries for IT professionals to establish and support this infrastructure
       and technology. Using existing hardware and infrastructure, they found a
       three-year savings of 34.26% ($251,393 U.S. dollars) when using the
       "Linux/Open Source Solution" instead of the proprietary "Microsoft solution".
       When new hardware and infrastructure were purchased, the savings were 24.69%.
       Note that this study is a follow-on of [507]an even earlier study; a
       [508]commentary is available at Linux Journal. It could be argued that this
       was merely a paper study, but they claim that they've seen significant
       savings in their consulting work. It's also fair to note that this
       organization is pro-FLOSS. In any case, TCO savings have been reported by
       real organizations, corroborating these results, as discussed below.
   11. An Italian study in 2002 found GNU/Linux to have a TCO 34.84% less than
       Windows. The [509]full study is in Italian; you can try to read an
       automatically-generated [510]translation.
   12. Forrester Research found that the average savings on TCO when using FLOSS
       database management systems (DBMSs) is 50%. The November 2006 article
       [511]"Open source databases `60 percent cheaper'" reports details of a
       Forrester study, where average TCO savings were determined to be 50%, and in
       some cases up to 60%.
   13. For many circumstances, the total cost savings can be substantial. For
       example, real-world savings exceeding $250,000 per year were reported by 32%
       of the Chief Technical Officers (CTOs) surveyed in a 2001 InfoWorld survey;
       60% of these CTOs saved over $50,000 annually. The August 27, 2001 InfoWorld
       (pages 49-50) reported on a survey of 40 CTOs who were members of the
       InfoWorld CTO network. In this survey, 32% using OSS reported savings greater
        than $250,000; 12% reported savings between $100,001 and $250,000; and 16%
        reported savings between $50,001 and $100,000. Indeed, only 8% reported annual
       savings less than $10,000 (so 92% were saving $10,000 or more annually). A
       chief benefit of OSS, according to 93% of the CTOs, was reduced cost of
       application development or acquisition; 72% said that a chief benefit was
       reduced development or implementation time (multiple answers were allowed).
       The CTOs reported using or planning to use OSS for web servers (65%), server
       OSes (63%), web-application servers (45%), application development testing
       (45%), and desktop OS (38%), among other uses. InfoWorld summarized it this
       way: "in early 2000, it seemed as if no one was using open-source software
       for business-critical tasks... a vast majority of today's corporate IT
       executives are now using or plan to use OSS OSes and web servers for their
       enterprise applications".
   14. The Robert Frances Group's July 2002 study found the TCO of GNU/Linux is
       roughly 40% (less than half) that of Microsoft Windows and only 14% that of
       Sun Microsystem's Solaris. [512]The Robert Frances Group (RFG), in Westport,
       Conn., studied actual costs at production deployments of Web servers running
       on GNU/Linux with Apache, Microsoft Windows with IIS, and Sun Solaris with
       Apache at 14 Global 2000 enterprises. These are real deployments where, if
       the web server goes down, money is lost - not minor prototype sites. Their
       TCO analysis was based on the software purchase price, hardware purchase and
       maintenance prices, software maintenance and upgrade prices, and
        administrative costs. To make the numbers comparable, these figures were
       scaled to a "processing unit" able to handle 100,000 hits per day; see the
       study for more information. They determined that over three years a (scaled)
       GNU/Linux deployment cost $74,475, a Windows deployment cost $190,662, and a
       Solaris deployment cost $534,020. Thus, the cost of running GNU/Linux is
       roughly 40% that of Microsoft Windows and only 14% that of Sun Microsystem's
       Solaris.
       This report also found that GNU/Linux and Solaris had smaller administrative
       costs than Windows. Although Windows system administrators cost less
        individually, each Linux or Solaris administrator could administer many
       more machines, making Windows administration much more costly. The study also
       revealed that Windows administrators spent twice as much time patching
        systems and dealing with other security-related issues as did Solaris or
       GNU/Linux administrators.
       RFG also examined some areas that were difficult to monetize. In the end,
       they concluded that "Overall, given its low cost and flexible licensing
       requirements, lack of proprietary vendor goals, high level of security, and
       general stability and usability, Linux is worth considering for most types of
       server deployments".
   15. In August 2005, Robert Frances Group (RFG) found Linux on x86 had a
       significantly lower TCO than Windows (40% less) or Solaris (54% less) as an
       application server. [513]Robert Frances Group's August 2005 study, funded by
        IBM, examined GNU/Linux, Windows, and Solaris when used as J2EE application
        servers (e.g., for typical business intranet applications). The GNU/Linux
       systems had a 3-year TCO of $40,149, compared to Microsoft Windows' $67,559
       (both on x86) and Solaris' $86,478 (on SPARC). This was based on a "3-year
       period of ownership for a system supporting 100,000 operations per second on
       the SPECjbb(R) benchmark".
       They included total costs, not just initial purchase price; TCO included
       hardware acquisition, software license and maintenance, OS support and
       systems administration, and application server support / system
       administration costs. RFG was surprised how much more expensive Windows
       hardware was; GNU/Linux systems were able to use much less expensive systems
       and more fully use the raw computing capacity to support the workload. (It
       may also be that the GNU/Linux users were more confident in the system
       reliability and security, and thus willing to use the same hardware for more
       simultaneous functions.) They also found that Windows required more
       administration time than either GNU/Linux or Solaris, and that GNU/Linux
        systems tended to need the fewest systems to do the same job; as a
       result, the administration costs were lowest in GNU/Linux.
   16. Netproject reported that the TCO with Linux on the desktop was 35% that of
       Microsoft Windows (a 65% savings). [514]Netproject's Cost of Ownership report
       found a very significant savings, and it reported the following causes:
          + The elimination of license fees for both the system software and office
            software;
          + Elimination of vendor churn that forces unnecessary software updates;
          + Reduction in the number of software security updates;
          + No need for anti-virus software for Linux computers [anti-virus software
            for Linux is only needed to check for viruses that run on Microsoft
            PCs];
          + Reduction in the number of support staff.
   17. A set of 2003 Gartner studies notes that the TCO of Linux (or FLOSS) on the
       desktop depends on your situation, but on average Linux cost less when used
        on the desktop. [515]Gartner reported that enterprises that installed
       Linux on client desktops would save $80 in hardware acquisition costs and an
       average of $74 per user per year on office automation software (assuming that
       StarOffice will be purchased instead of Microsoft Office). However, they also
       note that "lost productivity stemming from learning curves and compatibility
       can eat up direct-cost savings when moving to Linux on the desktop". A key
       issue is that many organizations have built or bought specialized
       applications that only run on Windows. Note that these studies primarily
       examine Linux vs. Windows on the client desktop, not other FLOSS deployment
       options (such as moving to web-based applications using FLOSS tools that work
       with any client operating system, or using FLOSS applications on Windows).
       Gartner concludes that both Windows and GNU/Linux can have a lower TCO,
       depending on your circumstance, and that "before migrating your desktop
       computers to Linux, take inventory of your business applications and compare
       Linux to Windows in terms of total cost of ownership".
   18. Enterprise Management Associates' February 2006 report claimed Linux tended
       to have a lower TCO than Windows. Enterprise Management Associates (EMA)'s
       report [516]Get the Truth on Linux Management, co-sponsored by Levanta (a
       specialist in Linux management and data virtualization) and the Open Source
       Development Labs, Inc. (OSDL), determined that "Sophisticated management
       tools now allow Linux management to be fast, effective, and inexpensive".
       They studied over 200 Linux-using enterprises, and found a number of
       statistical results, such as (and I quote):
          + Most Linux administrators spend less than 5 minutes per server per week
            on patch management; sophisticated tools reduce this even further.
           + Most respondents reported 99.99% or higher availability; 17% reported no
             downtime at all.
           + Linux acquisition costs can be almost $60,000 less per server considering
             software alone; hardware also tended to cost less.
           + Linux administrators tend to be able to manage more servers than Windows
             administrators, and the Linux systems tend to handle greater workloads.
           + Respondents strongly endorsed Linux as inherently less vulnerable.
       The sponsors obviously have a bias, but this report is trying to counter an
       alternative biased source.
   19. A majority of InternetWeek Newsbreak subscribers from companies with over $5
       million in revenues reported that FLOSS software costs substantially less
       than proprietary software.
        [517]A survey was conducted by TheOpenEnterprise.com (a joint editorial effort between
       InternetWeek.com and InformationWeek) of individuals with management
       responsibility for IT and software specifically in companies with over $5
       million in revenue. In this survey, 39% said "open source/standards-based
       software" costs 25% to 50% less than proprietary software, while 27% (over 1
        in 4) said it costs 50% to 75% less. In context, it appears their phrase was
       intended to mean the same (or similar) thing as the term FLOSS in this paper,
       since in many cases they simply use the term "open-source". As they note,
       "Would your CFO react favorably to a 50-75% reduction in software costs?"
   20. A report by Research and Markets found a number of cases where deploying open
       source software resulted in significant savings. The report [518]Saving Cash:
       A Comparison of Open Source and Proprietary Software (Oct 2004, 95 pages) on
        FLOSS in Germany shows significant savings potential through the deployment
        of open source software for companies of different sizes. The study found that
        the risk to users on account of copyright or patent violations is minimal. A
        set of interviews was used to create a detailed TCO model, and calculations
        were performed for typical case studies. Warning: This is an expensive report.
   21. The UK Government's British Educational Communications and Technology
       Association (Becta) found that using FLOSS could save a significant amount of
       money in primary and secondary schools. Becta is wrapping up a 3-year study
        that analyzed a sample of 15 schools that use FLOSS programs, comparing their
       costs (and other factors) to 45 (originally 33) schools which use proprietary
       software. Becta found that secondary schools could reduce their information
       technology overheads by 24% (including software, hardware, and support costs)
       by switching to FLOSS. Primary schools could cut their computer costs by
       nearly half using FLOSS. Support costs (usually 60% of a PC's total cost) had
       the biggest reduction in cost. Initial hardware costs were also lower,
        because the FLOSS required less expensive hardware
       compared to the proprietary solutions. They concluded that "FLOSS can be
       implemented successfully with cost benefits" and that "Use of office based
       FLOSS offers a cost-effective alternative to proprietary solutions". The case
       studies showed that the cost advantages of FLOSS "were often used to increase
       provision, rather than reduce overall budgets in schools".
       These results have been widely reported; see reports from [519]the Times
       Educational Supplement (TES), [520]ZDNet UK, [521]silicon.com, and [522]eGov
        monitor. Note that [523]Schoolforge has a detailed report from a 14 April 2005
       meeting summarizing the report.
   22. Many organizations report significant savings when using FLOSS. Here are a
       few examples of specific organizations saving money through FLOSS:
         a. The analysis [524]Linux as a Replacement for Windows 2000 compares Red
            Hat Linux 7.1 to Windows 2000; in this customer's case, using Linux
            instead of Windows 2000 saved $10,000. The reviewer came from a
            Windows/DOS background, and after performing an intensive hands-on Linux
            project lasting several months, determined that "you will be stunned by
            the bang for the buck that ... open source software offers".
         b. Intel's IT Vice President, Doug Busch, [525]reported savings of $200
            million by replacing costly Unix servers with cheaper servers running
            GNU/Linux.
         c. [526]Amazon.com was able to cut $17 million in technology expenses in a
            single quarter, largely due to a switch to Linux. Amazon spent $54
            million on technology and content expenses in its third quarter (ending
            Sept. 30), compared with $71 million in the year-ago quarter, and
            executives expected that technology costs as a portion of net sales
            would decrease by 20% this year.
         d. [527]The city of Largo, Florida reports a savings of $1 million per year
            using GNU/Linux and "thin clients".
         e. Dell offers a savings of 21% when using GNU/Linux. Dell computer has a
            dedicated hosting service, such as their [528]D-2800 offering. This
            service offers a respectable system (Pentium 850, 256MiB, 20GB,
            21GB/month bandwidth) in two configurations: Red Hat Linux 7.1 for
            $189/month, and Windows 2000 for $239/month. Thus, with identical
            hardware and bandwidth provision, the GNU/Linux system is 21% cheaper.
            This is especially interesting because Dell is not out to prove which
            system is better; as a business, they've just figured out competitive
            prices at which they can offer their services.
         f. [529]An independent report in Denmark concluded that if the political
            goals for using the Internet to improve the public sector are to be
            fulfilled, it would be $500 million cheaper over the next 10 years to
            use FLOSS instead of Microsoft software (my thanks to Poul-Henning Kamp,
            who translated the conclusions).
       There are many other reports from those who have switched to FLOSS systems;
       see the [530]usage reports section for more information.
   23. Even Microsoft has admitted that its products are more costly than GNU/Linux.
       For some time Microsoft has tried to convince users that its products are
       somehow less costly. However, as documented in [531]Var Business and [532]The
       Register, Microsoft CEO Steve Ballmer in 2002 admitted that Microsoft has not
       "figured out how to be lower-priced than Linux. For us as a company, we're
       going through a whole new world of thinking". The Register summarizes
       Microsoft's new approach as saying that "it costs more because it's worth
       more"; whether this is true is rather debatable in many cases, but at least
       it's a more sensible argument. However, Microsoft has gone back to trying to
       claim that they cost less, so the detail in this section is still needed.
   24. A Microsoft-sponsored study claims that Windows is cheaper than Linux, but
       this has been debunked as a general claim. [533]The Microsoft-sponsored study
       ([534]available from Microsoft) compared Windows 2000 to Linux; it stated
       that Linux had lower TCO for web serving, and Windows 2000 had a lower TCO
       for network infrastructure, print serving, file serving and security
       applications (note: the "David Wheeler" quoted in InfoWorld is not the author
       of this paper). I will give credit here: unlike the Mindcraft reports
       sponsored by Microsoft, this TCO report clearly states that it was sponsored
       by Microsoft, and I appreciate that.
       It's important to examine the assumptions of any TCO study, to see if its
       assumptions could apply to many other situations - and it is easily argued
       that they don't. [535]Joe Barr discusses some of the problems in this TCO
       study. These include assuming that the operating system is never upgraded in
       a 5-year period, using an older operating system Microsoft is transitioning
       from, and not using the current Enterprise license agreement (which many
       organizations find they must use). Costs that are not included in the study
       include legal advice costs (when signing large-scale agreements), purchase
       and maintenance of a software license inventory system (which you'll
       generally need even with Enterprise agreements), costs if you are audited,
       cost of insurance and liability incidents (if a proof of purchase is
       misplaced, you might need to pay the $151,000 per-incident liability), and
       paying multiple times for the same product (a side-effect of many Enterprise
       license agreements).
       Barr concludes with: "TCO is like fine wine: it doesn't travel well. What may
       be true in one situation is reversed in another. What gets trumpeted as a
       universal truth ( `Windows is cheaper than Linux' ) may or may not be true in
       a specific case, but it is most certainly false when claimed universally".
       Since the TCO of a system depends on its application, and Microsoft as
       sponsor could specifically set all of the parameters, the conclusions of the
       report were easily predicted.
   25. Another Microsoft-sponsored study claims that Microsoft's toolsuite with .NET
       is cheaper than using GNU/Linux with J2EE. [536]This Giga Research study
       sponsored by Microsoft compared the costs incurred by five large and
       medium-size companies that used J2EE (Java 2 Enterprise Edition) with the
       costs incurred by seven large and medium-size companies that used .Net
       applications to develop Web portal applications. For large corporations, the
       cost of using Microsoft products (for development and deployment plus three
       years of maintenance) was 28% less than for J2EE/Linux. For medium-size
       companies, the Microsoft products were 25% cheaper.
       However, once again, the TCO values all hinge on the assumptions made.
       [537]As CIO.com points out, the Microsoft-based solution was cheaper
       primarily because the GNU/Linux systems were configured using extremely
       expensive proprietary products such as those from Oracle (for the database
       system) and BEA (for the development system).
       A company can certainly choose to use these particular products when
       developing with GNU/Linux, but not all organizations will choose to do so.
       Indeed, the acronym "LAMP" (Linux, Apache, MySQL, and PHP/Python/Perl) was
       coined because that combination is extremely popular when creating web portal
       applications. MySQL and PostgreSQL are popular FLOSS database programs; PHP,
       Python, and Perl are popular FLOSS development languages (and tie easily into
       the rest of the development suite provided by FLOSS operating systems). An
       obvious question to ask is, "Why were extremely common configurations (such
       as LAMP) omitted in this Microsoft-funded study?" CIO.com reports Giga's
       answer: "Microsoft didn't ask them [to] look at any such companies".
       Again, I give credit to Giga for clearly reporting who funded the study.
       Indeed, if your situation closely matches Giga's study, your costs might be
       very similar. But it would be a mistake to conclude that different situations
       would necessarily have the same results.
   26. A 2005 InformationWeek survey reported that GNU/Linux was cheaper than
       mainframe systems, Windows, and Unix according to 70% of the respondents.
       [538]InformationWeek Research Brief "Linux Outlook" published February 2005
       surveyed 439 business technology professionals, and found that "Respondents
       in this study agree that Linux is less expensive. At least seven in 10 sites
       report that Linux is cheaper to operate than mainframe systems, Windows NT,
       Windows 2000 servers, Windows XP servers and Commercial Unix servers.
       Companies also say Linux is a cheaper PC option than Commercial Unix, Windows
       XP or Macintosh. Only PC terminals offer some cost competitiveness".
   27. Georgia Public Library Service's Evergreen program is saving that library
       system over $3 million a year. [539]Linux.com reports that librarians at the
       Georgia Public Library Service (GPLS) have developed the open source,
       enterprise-class library management system called [540]Evergreen for
       large-scale libraries. Evergreen is an Integrated Library System (ILS) --
       meaning that it manages, catalogs, and tracks the circulation of library
       holdings. GPLS looked at existing FLOSS applications; they noted that while
       Koha would work fine for a 10-branch library, none met their needs for
       supporting their large-scale environment, so they wrote their own. Evergreen
       supports GPLS' 252 member libraries - almost the entire U.S. state of Georgia
        - with 8.8 million items in its index and 1.6 million active cardholders. It
        went live in September 2006, and by one account was the "easiest conversion
        I've ever been through in my 25 years of working in libraries".
       Their cost savings came at many levels. Their old system required expensive
       Sun servers, while the new one uses a much cheaper GNU/Linux cluster.
       Replacing the system across their libraries with a proprietary system would
        have cost more than $15 million, plus about $5 million a year
       for maintenance (with 252 libraries, license fees quickly became very
       expensive). They run their Evergreen system for only $1.6 million a year -
       over $3 million in savings annually. Also, by releasing as FLOSS, they can
       share the cost of maintaining and improving the software with others. For
       example, in December 2006 the University of Windsor announced that it was
       officially partnering with GPLS to help add new capabilities to Evergreen.
       Being FLOSS has other advantages, for example, instead of making a request
       and hoping someday the vendor will respond, they can implement what is
       important to them, sometimes literally overnight. Evergreen already includes
       many innovations lacking in many or all proprietary ILS products, such as
       on-the-fly spellcheck, search suggestions, and reviews, as well as allowing
       users to create "bookbags" of selected titles that can be shared with other
       patrons.
       This experience was so successful that [541]Kent County Public Library
       recently switched to Evergreen, with very positive results.
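
   As a small illustration of the acquisition-cost arithmetic in the first item above
   (referenced there), here is a sketch in Python that simply redoes the addition
   behind the year-2000 server figures quoted in that item; nothing here is new data,
   it only recomputes the totals from the listed prices.

      # Recompute the year-2000 server acquisition totals quoted in item 1 above.
      # Prices (US$) are the ones listed there; this just does the addition.

      windows = {"os_25_client": 1510, "email_10_client": 1300,
                 "rdbms_10_cal": 2100, "cpp_dev": 500}
      redhat_professional = 156   # Red Hat Linux 6.2; web, email, RDBMS, C++ included
      redhat_deluxe = 76

      web_server_windows = windows["os_25_client"] + windows["rdbms_10_cal"]    # 3610
      intranet_windows = windows["os_25_client"] + windows["email_10_client"]   # 2810

      print(f"Public web server: Windows ${web_server_windows} vs. Red Hat ${redhat_professional}")
      print(f"Intranet server:   Windows ${intranet_windows} vs. Red Hat ${redhat_deluxe}")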

   You may also want to see [542]MITRE Corporation's business case study of OSS,
   which considered military systems.

   Most of these items assume that users will use the software unmodified, but even
   if the FLOSS software doesn't do everything required, that is not necessarily the
   end of the story. One of the main hallmarks of FLOSS software is that it can be
   modified by users. Thus, any true TCO comparison should consider not just the
   products that fully meet the requirements, but also existing options that, with
   some modifications, could meet the requirements. It may be cheaper to start with
   an existing FLOSS program, and improve it, than to start with a proprietary
   program that has all of the necessary functionality. Obviously, the total TCO
   including such costs varies considerably depending on the circumstances.

   [543]ComputerWorld published an essay by Bernard Golden (Navica), who argues that
   a switch from proprietary software to FLOSS (such as from Windows to Linux)
   represents the worst-case return on investment (ROI) scenario for FLOSS. This is because a
   transition requires retraining and perhaps hiring new personnel, which are always
   expensive propositions. Indeed, in general, any change imposes the costs of
   managing that change.

   [544]Brendan Scott (a lawyer specializing in IT and telecommunications law)
   argues that the long run TCO of FLOSS must be lower than proprietary software.
   Scott's paper makes some interesting points, for example, "TCO is often referred
   to as the total cost of `ownership'... [but] `ownership' of software as a concept
   is anathema to proprietary software, the fundamental assumptions of which revolve
   around ownership of the software by the vendor. ... The user [of proprietary
   software] will, at best, have some form of (often extremely restrictive) license.
   Indeed, some might argue that a significant (and often uncosted) component of the
   cost of `ownership' of proprietary software is that users don't own it at all".
   The paper also presents arguments as to why GPL-like free software gives better
   TCO results than other FLOSS licenses. Scott concludes that "Customers attempting
   to evaluate a free software v. proprietary solution can confine their
   investigation to an evaluation of the ability of the packages to meet the
   customer's needs, and may presume that the long run TCO will favor the free
   software package. Further, because the licensing costs are additional dead weight
   costs, a customer ought to also prefer a free software solution with
   functionality shortfalls where those shortfalls can be overcome for less than the
   licensing cost for the proprietary solution".

   Microsoft's first TCO study comparing Windows to Solaris (mentioned earlier) is
   not a useful starting point for estimating your own TCO. Their study reported the
   average TCO at sites using Microsoft products compared to the average TCO at
   sites using Sun systems, but although the Microsoft systems cost 37% less to own,
   the Solaris systems handled larger databases, more demanding applications, 63%
   more concurrent connections, and 243% more hits per day. In other words, the
   Microsoft systems that did less work cost less than systems that did more work.
   This is not a useful starting point if you're using TCO to help determine which
   system to buy - to make a valid comparison by TCO, you must compare the TCOs of
   systems that meet your requirements. A two-part analysis by Thomas Pfau (see
   [545]part 1 and [546]part 2) identifies this and many other flaws in the study.
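
   As a back-of-the-envelope illustration of why workload must be factored in, the
   following sketch (in Python) uses only the relative figures quoted above - the
   Microsoft systems costing 37% less while the Solaris systems served 243% more hits
   per day - to compute a cost per hit served. It deliberately ignores the other
   workload differences (database size, application demands, concurrent connections),
   so it only shows the normalization step, not a re-scoring of the study.

      # Normalize TCO by workload using the relative figures quoted above.
      # Let the Solaris sites' TCO and hits/day both be 1.0 (arbitrary units).

      solaris_tco, solaris_hits = 1.00, 1.00
      microsoft_tco = 1.00 - 0.37          # Microsoft systems cost "37% less to own"
      microsoft_hits = 1.00 / (1 + 2.43)   # Solaris handled 243% *more* hits per day

      print(f"Relative cost per hit, Solaris:   {solaris_tco / solaris_hits:.2f}")      # 1.00
      print(f"Relative cost per hit, Microsoft: {microsoft_tco / microsoft_hits:.2f}")  # ~2.16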

   There are some studies that emphasize Unix-like systems, not FLOSS, which claim
   that there are at least some circumstances where Unix-like systems are less
   costly than Windows. [547]A Strategic Comparison of Windows vs. Unix by Paul
   Murphy is one such paper. It appears that many of these arguments would also
   apply to FLOSS systems, since many of them are Unix-like.

   Be sure that you actually compute your own TCO; don't just accept a vendor's word
   for it, and in particular, don't just accept a vendor's claims for the TCO of its
   competitors. In 2004, the Newham council chose Microsoft products over a mixed
   solution, reporting that its selected solution had a lower TCO according to an
   independent study. Yet [548]when the reports were made public in September 2004,
   it was discovered that Microsoft itself had created the cost figures for switching
   to its competitor - not an independent source at all. Any vendor (open or
   closed) can tell you why their competitor costs more money, if you naïvely let
   them.
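
   To make this concrete, below is a minimal, spreadsheet-style sketch (written in
   Python purely for illustration; every figure and category name is a hypothetical
   placeholder, not data from this paper) of the kind of calculation involved: sum
   each candidate's recurring costs over your planning horizon, add its one-time
   costs, and compare only candidates that actually meet your requirements.

# Illustrative TCO comparison sketch (hypothetical figures and categories).
# Replace every number below with your own measured or quoted costs, and only
# compare candidates that actually meet your requirements.

YEARS = 5  # planning horizon

candidates = {
    "proprietary_suite": {
        "recurring": {"licenses": 40_000, "support": 15_000, "administration": 60_000},
        "one_time": {"migration": 0, "retraining": 0},            # already deployed
    },
    "floss_suite": {
        "recurring": {"licenses": 0, "support": 20_000, "administration": 50_000},
        "one_time": {"migration": 30_000, "retraining": 25_000},  # switching costs
    },
}

def total_cost_of_ownership(costs, years=YEARS):
    """Recurring costs over the planning horizon plus all one-time costs."""
    return sum(costs["recurring"].values()) * years + sum(costs["one_time"].values())

for name, costs in candidates.items():
    print(f"{name}: ${total_cost_of_ownership(costs):,} over {YEARS} years")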

   Again, it's TCO that matters, not just certain cost categories. However, given
   these large differences in certain categories, in many situations FLOSS has a
   smaller TCO than proprietary systems. At one time it was claimed that FLOSS
   installation took more time, but nowadays FLOSS systems can be purchased
   pre-installed and automatic installers result in equivalent installation labor.
   Some claim that system administration costs are higher, but studies like Sun's
   suggest that in many cases the system administration costs are lower, not higher,
   for Unix-like systems (at least Sun's). For example, on Unix-like systems it
   tends to be easier to automate tasks (because you can use a GUI but are not
   required to) - thus over time many manual tasks can be automated (reducing TCO).
   Retraining costs can be significant - but now that GNU/Linux has modern GUI
   desktop environments, there's anecdotal evidence that this cost is actually quite
   small. I've yet to see serious studies quantitatively evaluating this issue, but
   anecdotally, I've observed that people familiar with other systems are generally
   able to sit down and use modern GNU/Linux GUIs without any training at all. In
   short, it's often hard to show that a proprietary solution's purported advantages
   really help offset its demonstrably larger costs in other categories when
   there's a competing mature FLOSS product for the given function.

   One factor that needs to be included in a TCO analysis is switching costs, where
   they apply. Thankfully, most people remember to include the costs of switching
   to something. As noted in [549]"IT analysts' influence on open source adoption",
   Gartner Vice President Mark Driver says that the best place for a company to
   first deploy Linux in a large way is in a new-from-scratch operation rather than
   as a replacement for Windows. That's because, "Gartner's (and other analysts')
   figures show that migration from another operating system and porting software
   written for the old operating system are the two largest costs of a Linux
   migration, [so] it is obvious -- at least to Driver -- that Linux TCO drops
   radically when you avoid the migration step and install Linux in the first
   place".

   However, don't forget to include the extremely important costs of switching away
   from a decision later. As noted in [550]Linux Adoption in the Public Sector: An
   Economic Analysis by Hal R. Varian and Carl Shapiro (University of California,
   Berkeley; 1 December 2003), users should be wary of "a system that will be
   difficult to switch away from in the future, in part because the lock-in
   associated with using such a system[,]
   will reduce their future bargaining power with their vendor. Vendors always have
   some incentive to make it difficult for users to switch to alternatives, while
   the users will generally want to preserve their flexibility. From the user's
   viewpoint, it is particularly important to make sure that file formats, data,
   system calls, APIs, interfaces, communication standards, and the like are well
   enough documented that it is easy to move data and programs from one vendor to
   another". Obviously, someone who elects to use a proprietary program that locks
   them into that specific program will almost certainly pay much higher prices for
   future updates, because the vendor can exploit the user's difficulty in changing.
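
   To see roughly why lock-in translates into higher future prices, consider a toy
   calculation (written in Python for illustration; all numbers are invented): a
   locked-in vendor can keep raising its renewal price until staying costs the
   customer about as much as paying the switching cost and leaving, so the larger
   the switching cost, the higher the price the vendor can sustain.

# Toy model of vendor pricing power under lock-in (all numbers are invented).
# A vendor can raise its renewal price until staying costs the customer as
# much as switching would; lowering switching costs lowers that ceiling.

competitive_price = 10_000   # annual price if switching were effectively free
switching_cost = 50_000      # one-time cost to convert data, retrain, re-port
years_remaining = 5          # remaining planning horizon

# Customer stays while: locked_in_price * years <= competitive_price * years + switching_cost
locked_in_ceiling = competitive_price + switching_cost / years_remaining

print(f"Sustainable annual price with lock-in:    ${locked_in_ceiling:,.0f}")
print(f"Sustainable annual price without lock-in: ${competitive_price:,.0f}")
# Well-documented file formats, data, and APIs shrink switching_cost and
# therefore shrink the vendor's pricing power.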

   Clearly, if one product is significantly more productive than another where it's
   used, it's worth paying more for it. However, it's clear that at least for major
   office tasks, GNU/Linux systems are about as usable as Windows systems. For
   example, [551]one usability study comparing GNU/Linux to Microsoft Windows XP
   found that it was almost as easy to perform most major office tasks using
   GNU/Linux as with Windows: "Linux users, for example, needed 44.5 minutes to
   perform a set of tasks, compared with 41.2 minutes required by the XP users.
   Furthermore, 80% of the Linux users believed that they needed only one week to
   become as competent with the new system as with their existing one, compared with
   85% of the XP users". [552]The detailed report (in German) is also available.

   Does this mean that FLOSS always has the lowest TCO? No! As I've repeatedly
   noted, it depends on its use. But the notion that FLOSS always has a larger TCO
   is simply wrong.

                                8. Non-Quantitative Issues

   In fairness, I must note that not all issues can be quantitatively measured, and
   to many people they are the most important ones. These issues include
   [553]freedom from control by another (especially a single source),
   [554]protection from licensing litigation, [555]flexibility, [556]social / moral
   / ethical issues, and [557]innovation.

    1. FLOSS protects its users from the risks and disadvantages of single source
       solutions. While "free software" advocates use the term "freedom," and some
       businesses emphasize different terms such as "free market", "multiple
       sources", "alternate supply channels", and "the necessity of multiple
       vendors", the issue is the same: users do not want to be held hostage by any
       one vendor. Businesses often prefer to buy products in which there is a large
       set of competing suppliers, because it reduces their risk; they can always
       switch to another supplier if they're not satisfied, the supplier raises
       their prices substantially, or the original supplier goes out of business.
       This translates into an effect on the products themselves: if customers can
       easily choose and switch between competing products, the products' prices go
       down and their quality goes up. Conversely, if there is a near or real
       monopoly for a given product, over time the vendor will continuously raise
       the cost to use the product and limit its uses to those that benefit the
       monopolist. Users who are unwilling to leave single source solutions often
       pay dearly later as their single source raises their costs.
       For example, many organizations have chosen to use Microsoft's products
       exclusively, and Microsoft is trying to exploit this through its new
       "Microsoft Licensing 6.0 Program". The [558]TIC/Sunbelt Software Microsoft
       Licensing Survey Results (covering March 2002) reports the impact on
       customers of this new licensing scheme. 80% had a negative view of the new
       licensing scheme, noting, for example, that the new costs for software
       assurance (25% of list for server and 29% of list for clients) are the
       highest in the industry. Of those who had done a cost analysis, an
       overwhelming 90% say their costs will increase if they migrate to 6.0, and
       76% said their costs would increase from 20% to 300% from what they are
       paying now under their current 4.0 and 5.0 Microsoft Licensing plans. This
       survey found that 36% of corporate enterprises don't have the funds to
       upgrade to the Microsoft Licensing 6.0 Program. Half indicated that the new
       agreement would almost certainly delay their migration initiatives to new
       Microsoft client, server and Office productivity platforms, and 38% say they
       are actively seeking alternatives to Microsoft products. In [559]New Zealand
       a Commerce Commission Complaint has been filed claiming that Microsoft's
       pricing regime is anti-competitive. Craig Horrocks notes that the Software
       Assurance approach does not assure that the purchaser receives anything for
       the money; it merely buys the right to upgrade to any version Microsoft
       releases in the covered period. Microsoft may levy further charges on a
       release, and the contract does not obligate Microsoft to deliver anything in
       the time period.
       There are increasing concerns about Microsoft's latest releases of Windows.
       Michael Jennings argues in [560]Windows XP Shows the Direction Microsoft is
       Going that Microsoft users are increasingly incurring invasion of privacy,
       intentionally crippled yet necessary services, and other problems.
       More generally, defining an organization's "architecture" as being whatever
       one vendor provides is sometimes called [561]"Vendor Lock-in" or
       "Pottersville", and this "solution" is a well-known [562]AntiPattern (an
       AntiPattern is a "solution" that has more problems than it solves). Vendors
       are not foolish; given such power, they may add [563]draconian rules that
       cause problems for users.
       Having only one vendor completely control a market is dangerous from the
       viewpoint of costs (since the customer then has no effective control over
       costs), and it also raises a security concern: the monoculture vulnerability.
       In biology, it is dangerous to depend on one crop strain, because any disease
       can cause the whole crop to fail. Similarly, one proprietary vendor who
       completely controls a market creates a uniformity that is far easier to
        massively attack. FLOSS programs provide an alternative implementation, and
        even when one dominant FLOSS program exists, the availability of source code
        means implementations can be changed, so at least some are likely to be more
        resistant to attack.
       Historically, proprietary vendors eventually lose to vendors selling products
       available from multiple sources, even when their proprietary technology is
       (at the moment) better. Sony's Betamax format lost to VHS in the videotape
        market, IBM's Micro Channel architecture lost to ISA in the PC architecture
        market, and Sun's NeWS lost to the X Window System in networked graphics,
       all because customers prefer the reduced risk (and eventually reduced costs)
       of non-proprietary products. This is sometimes called "commodification", a
       term disparaged by proprietary vendors and loved by users. Since users spend
       the money, users eventually find someone who will provide what they want, and
       then the other suppliers discover that they must follow or give up the market
       area.
       With FLOSS, users can choose between distributors, and if a supplier abandons
       them they can switch to another supplier. As a result, suppliers will be
       forced to provide good quality products and services for relatively low
       prices, because users can switch if they don't. Users can even band together
       and maintain the product themselves (this is how the Apache project was
       founded), making it possible for groups of users to protect themselves from
       abandonment.
       The article [564]Commentary from a new user: Linux is an experience, not an
       operating system, describes freedom this way:

     "As I worked in Linux... the word `free' took on a far greater meaning. As the
     advocates of the Open Source and Free Software movements put it, free means
     freedom. Yes, as a humble user of Linux, I am experiencing freedom and pride
     in using a world-class operating system.

     Linux is not only an operating system. It embodies a myriad of concepts about
     how the world of computers and software should be. This is an operating system
     designed by the world, meant for the world. Everyone who is interested in
     Linux, can develop, share and use it. People can contribute their best in
     programming, documenting or in any aspect of their choice. What a novel
     concept!
     Free in Linux spells freedom -- freedom to use Linux, freedom to use the code,
     freedom to tweak and improve it. Not being a programmer, I still can be happy
     about many things. For me, freedom has meant that my operating system is
     transparent, and there are no hidden codes at work in my computer. Nothing
     about Linux is hidden from me. ... I've gained more control over my computer
     for the first time in my life".
    2. FLOSS protects its users from licensing litigation and management costs.
       Proprietary vendors make money from the sale of licenses, and are imposing
       increasingly complex mechanisms on consumers to manage these licenses. For
       example, Microsoft's Windows XP requires [565]product activation - a scheme
       that means that an accumulation of hardware changes requires a new activation
       code. A license no longer gives unlimited rights to reinstall - if you have
       hardware trouble, you may end up being forced to re-buy your product. Indeed,
       for a variety of reasons, [566]businesses are finding that they must buy the
       same proprietary software more than once.
       Proprietary vendors also litigate against those who don't comply with their
       complex licensing management requirements, creating increased legal risks for
       users. For example, the Business Software Alliance (BSA) is a proprietary
       software industry organization sponsored by Microsoft, Macromedia, and
       Autodesk, and spends considerable time searching for and punishing companies
       who cannot prove they are complying. As noted in the [567]SF Gate (Feb. 7,
       2002), the BSA encourages disgruntled employees to call the BSA if they know
       of any license violations. "If the company refuses to settle or if the BSA
       feels the company is criminally negligent and deliberately ripping off
       software, the organization may decide to get a little nastier and organize a
       raid: The BSA makes its case in front of a federal court in the company's
       district and applies for a court order. If the order is granted, the BSA can
       legally storm the company's offices, accompanied by U.S. marshals, to search
       for unregistered software".
       [568]Software Licensing by Andrew Grygus discusses the risks and costs of
        proprietary licensing schemes in more detail. According to the article,
       "the maximum penalty is $150,000 per license deficiency; typically, this is
       negotiated down, and a company found deficient at around $8,000 will pay a
       penalty of around $85,000 (and must buy the $8,000 in software too)". For
        example, [569]information services for the city of Virginia Beach, VA were
        practically shut down for over a month, and five employees (a quarter of the
        staff) had to be dedicated to putting the city's licensing in order to answer
        a random audit demand by Microsoft, at a cost of over $80,000. Eventually the
        city was fined
       $129,000 for missing licenses the city had probably paid for but couldn't
       match to paperwork. [570]Temple University had to pay $100,000 to the BSA, in
       spite of strong policies forbidding unauthorized copying.
       To counter these risks, organizations must keep careful track of license
       purchases. This means that organizations must impose strict software license
       tracking processes, purchase costly tracking programs, and pay for people to
       keep track of these licenses and perform occasional audits.
       A related problem is that companies using proprietary software must, in many
       cases, get permission from their software vendors to sell a business unit
       that uses the proprietary software, or face legal action. For example,
       [571]Microsoft has filed objections to Kmart's proposed $8.4 million sale of
       Bluelight.com to United Online Inc., citing software licensing as one of
       their concerns. Microsoft stated that "The licenses that debtors (Kmart) have
       of Microsoft's products are licenses of copyrighted materials and, therefore,
       may not be assumed or assigned with[out] Microsoft's consent". Whether or not
       this is a risk depends on the licensing scheme used; in many cases it appears
       that the legal "right of first sale" doctrine cannot be applied (for example,
       there are many different licensing schemes for Windows, so the same action
       with Windows may be legal or not depending on the licensing scheme used to
       acquire it).
       In contrast, FLOSS users have no fear of litigation from the use and copying
       of FLOSS. Licensing issues do come up when FLOSS software is modified and
       then redistributed, but to be fair, proprietary software essentially forbids
       this action (so it's a completely new right). Even in this circumstance,
       redistributing modified FLOSS software generally requires following only a
       few simple rules (depending on the license), such as giving credit to
       previous developers and releasing modifications under the same license as the
       original program.
       One intriguing example is [572]the musical instrument company Ernie Ball,
        described in World Trade, May 2002. A disgruntled ex-employee turned them
        in to the Business Software Alliance (BSA), which then arranged to have them
        raided by armed federal marshals. Ernie Ball was completely shut down for a
       day, and then was required to not touch any data other than what is minimally
       needed to run their business. After the investigation was completed, Ernie
       Ball was found to be non-compliant by 8%; Ball argued that it was "nearly
       impossible to be totally compliant" by their rules, and felt that they were
       treated unfairly. The company ended up paying a $90,000 settlement, $35,000
        of which were Microsoft's legal fees. Ball decided at that moment that his
        company would become "Microsoft free". In one year he converted to a
       Linux-based network and UNIX "mainframe" using Sun's StarOffice (Sun's
       proprietary cousin to OpenOffice); he now has no Microsoft products at all,
       and much of the software is FLOSS or based on FLOSS products.
    3. FLOSS has greater flexibility. FLOSS users can tailor the product as
       necessary to meet their needs in ways not possible without source code. Users
       can tailor the product themselves, or hire whoever they think can solve the
       problem (including the original developer). Some have claimed that this
       creates the "danger of forking," that is, of multiple incompatible versions
       of a product. This is "dangerous" only to those who think competition is evil
        - we have multiple versions of cars as well. And in practice, the high cost
        of maintaining software yourself means that changes are usually contributed
        back to the community. If a change isn't contributed (e.g., it solves a
        problem that needed solving but only for a specialized situation), it's still
        a win for the user, because it solved a problem that would otherwise have
        gone unsolved.
       For example, [573]in 1998 Microsoft decided against developing an Icelandic
       version of Windows 95 because the limited size of the market couldn't justify
        the cost. Without the source code, the Icelandic people had little recourse.
        However, FLOSS programs can be modified, so Icelandic support was immediately
        added to them, without any need for negotiation with a vendor. Similarly,
        in [574]July 2004, Welsh support became available in the FLOSS OpenOffice.org,
        making it the first complete office environment available in Welsh. Users
       never know when they will have a specialized need not anticipated by their
       vendor; being able to change the source code makes it possible to support
       those unanticipated needs.
       The [575]IDC study "Western European End-User Survey: 2005 Spending
       Priorities, Outsourcing, Open Source, and Impact of Compliance" surveyed 625
       European companies of over 100 employees. They found that 25% had significant
       FLOSS operating system (Linux) deployments (beyond limited deployments or
       pilots), and 33% had significant FLOSS database deployments. The most
       important cited FLOSS benefit wasn't lower cost, but was the flexibility of
       deploying whenever they wanted without having to negotiate anything. In
       addition, many companies specifically stated that a key advantage of FLOSS
       was the flexibility provided because it could be customized; this wasn't one
       of the multiple-choice answers, yet many companies added it as a comment.
    4. Many believe that there are social, moral, or ethical imperatives for using
       FLOSS. The Free Software Foundation has [576]a set of papers describing their
       philosophy, i.e., why they believe Free Software is an ethical imperative.
       These lengthy documents explain themselves in depth, so there's little need
       to describe them further here.
    5. There is ample evidence that FLOSS encourages, not quashes, innovation.
       Innovation is a strength, not a liability, for FLOSS. InformationWeek's
       survey of business-technology professionals [577]"Open-Source Software Use
       Joins The Mix", published in November 2004, found that FLOSS "is believed to
       create more opportunities for innovation than commercial or proprietary
       software". Nearly 60% of the companies with annual revenue of $100 million or
       more stated that FLOSS creates more opportunities for innovation. Small
       businesses (less than $100 million), where much innovation takes place,
       agreed even more strongly; "almost three-quarters report open-source software
       readily promotes more opportunities for IT innovation". A later
       [578]InformationWeek Research Brief "Linux Outlook" published February 2005
       surveyed 439 business technology professionals. In this survey, two-thirds
       contend that open-source spurs more opportunities for technical innovation,
        and nearly half (47%) say it encourages business innovation. This is consistent with
       previous surveys of expectations. The February 2001 research paper
       [579]Distributed Knowledge and the Global Organization of Software
       Development by Anca Metiu and Bruce Kogut (The Wharton School, University of
       Pennsylvania) reports on field observations of companies in four countries.
       They state that, "the open development model opens up the ability to
       contribute to innovation on a global basis. It recognizes that the
       distribution of natural intelligence does not correspond to the
       monopolization of innovation by the richest firms or richest countries. It is
       this gap between the distribution of ability and the distribution of
       opportunity that the web will force companies to recognize, and to realign
       their development strategies. For the young engineer in India, China, or
       Israel - who cannot or does not want to come to the Silicon Valley, or the
       Research Triangle, or Munich - is increasingly able to contribute to world
       innovation". In 2000, a Forrester Research study interviewed 2,500 IT
       managers and found that 84% of them forecast that open source software would
        be the spark behind major innovations throughout the industry.
       It's not just business people and observers of them; software developers
       themselves report that FLOSS projects are often innovative. [580]According to
       the BCG study of FLOSS developers, 61.7% of surveyed developers stated that
       their FLOSS project was either their most creative effort or was equally as
       creative as their most creative experience. Government employees also report
       that FLOSS supports innovation; [581]Federal Computer Week (FCW) published
       the article "Linux use drives innovation: FBI info-sharing project is one of
       a growing list of open-source successes". The article declares that the
       "open-source operating system [Linux]'s flexibility allowed engineers greater
       freedom to tailor technology to their needs" and that "Linux is well-suited
       to federal projects with small teams and scarce resources... many Linux
       applications, such as the Census Bureau's Fast Facts service, can support an
       entire enterprise".
        There are many examples showing how innovation occurs in FLOSS. Eric S.
       Raymond's widely-read essay [582]The Cathedral and the Bazaar describes one
       case of this happening in his project, fetchmail. He had been developing a
       product to do one job, when [583]a user proposed an approach that changed the
       entire nature of his project. In Raymond's words, "I realized almost
       immediately that a reliable implementation of this feature would make [a
       significant portion of the project] obsolete". He found that "Often, the most
       striking and innovative solutions come from realizing that your concept of
       the problem was wrong" and that "the next best thing to having good ideas is
       recognizing good ideas from your users. Sometimes the latter is better". In
       February 2005, [584]Roman Kagan noted that the Linux kernel "hotplug" system
       could be greatly simplified. The maintainer of the hotplug system, Greg K-H,
       replied by saying [585]"You know, it's moments like this that I really think
       the open source development model is the best. People are able to look into a
       project and point out how stupid the original designers/authors are at any
       moment in time :) You are completely correct, I love your [approach]. With
       it, and a few minor changes ... we don't need _any_ of the module_* programs
       in the hotplug-ng package I just released. That is wonderful, thank you so
        much for showing me that I was just working in circles". The ability to modify
       real-world programs enables all sorts of experimentation; for example,
       [586]Symphony OS is a modified GNU/Linux distribution designed to try out a
       radically new approach to user interfaces; from a plethora of experiments,
       the successful ones get included in future versions. In short, FLOSS enables
       interaction between developers and users, as well as interaction between
       developers, that can encourage innovation.
       There's even a whole book about this; [587]Innovation Happens Elsewhere: Open
       Source as Business Strategy by Ron Goldman and Richard P. Gabriel notes the
       value of FLOSS for innovation.
       This is not a new phenomenon; many key software-related innovations have been
       FLOSS projects. For example, [588]Tim Berners-Lee, inventor of the World Wide
       Web, stated in December 2001 that "A very significant factor [in widening the
       Web's use beyond scientific research] was that the software was all (what we
       now call) open source. It spread fast, and could be improved fast - and it
       could be installed within government and large industry without having to go
       through a procurement process". The Internet's critical protocols, such as
       TCP/IP, have been developed and matured through the use of FLOSS. The
       [589]Firefox web browser has some very interesting innovations, such as
       [590]live bookmarks (making RSS feeds look just like bookmark folders, and
       enabling simple subscription), as well as incorporating innovations from
       other browsers such as tabbed browsing and pop-up blocking. Indeed, [591]many
       people are working hard to create new innovations for the next version of
       Firefox.
       Leading innovation expert [592]Professor Eric von Hippel is the head of the
       management of innovation and entrepreneurship group at the Massachusetts
       Institute of Technology (MIT) Sloan School of Management. He has studied in
       detail how innovation works, including how it works in the development of
       FLOSS programs. His studies suggest that FLOSS can significantly enable
       innovation. In the interview [593]Something for nothing of von Hippel and
       Karim Lakhani, they report that "Apache and other open-source programs are
       examples of user-to-user innovation systems". von Hippel explained that
       "Users may or may not be direct customers of the manufacturer. They may be in
       different industries or segments of the marketplace, but they are out in the
       field trying to do something, grappling with real-world needs and concerns.
       Lead users are an innovative subset of the user community displaying two
       characteristics with respect to a product, process or service. They face
       general needs in a marketplace but face them months or years before the rest
       of the marketplace encounters them. Since existing companies can't customize
       solutions good enough for them, lead users go out there, patch things
       together and develop their own solutions. They expect to benefit
       significantly by obtaining solutions to their needs. When those needs are
       evolving rapidly, as is the case in many high-technology product categories,
       only users at the front of the trend will have experience today with
       tomorrow's needs and solutions. Companies interested in developing
       functionally novel breakthroughs... will want to find out how to track lead
        users down and learn from what they have developed...". He closes by noting that
       "We believe Apache and open source are terrific examples of the lead user
       innovation process that can take teams and companies in directions they
       wouldn't have otherwise imagined". von Hippel has elsewhere noted that in
       certain industries approximately 80% of new developments are customer based;
       vendors ignore customers at their peril. For more information on this work
       relating to FLOSS, innovation, and user interaction, see Nik Franke and Eric
       von Hippel's [594]Satisfying Heterogeneous User Needs via Innovation
       Toolkits: The Case of Apache Security Software, Karim Lakhani and Eric von
       Hippel's [595]How Open Source Software Works: Free User to User Assistance,
       Eric von Hippel's [596]Horizontal innovation networks- by and for users, Eric
       von Hippel and Georg von Krogh's [597]Exploring the Open Source Software
       Phenomenon: Issues for Organization Science (which proposes that FLOSS
       development is a compound innovation model, containing elements of both
       private investment and collective action), and Eric von Hippel's [598]Open
       Source Shows the Way - Innovation By and For Users - No Manufacturer
       Required.
       Other academics who study innovation have come to similar conclusions:
          + Joachim Henkel (at Germany's University of Munich, Institute for
            Innovation Research) wrote the paper [599]"The Jukebox Mode of
            Innovation - a Model of Commercial Open Source Development". In it, he
            creates a model of innovation in software, and finds that "free
            revealing of innovations is a profit-maximizing strategy... a regime
            with compulsory revealing [e.g., copylefting licenses] can lead to
            higher product qualities and higher profits than a proprietary regime".
           + Tzu-Ying Chan and Jen-Fang Lee (at Taiwan's National Cheng Chi
            University of Technology & Innovation Management) wrote [600]"A
            Comparative Study of Online User Communities Involvement In Product
            Innovation and Development", which identified a number of different
            types of online user communities. They discussed in particular the "user
            product collaboration innovation community", noting that firms must play
            a supporting/complementary role for effective interactions with this
             community, a role very different from their interactions with many other
            kinds of communities.
          + Alessandro Nuvolari's peer-reviewed paper "Open source software
            development: Some historical perspectives" provides evidence that open
            source software is a case of what Robert C. Allen has termed "collective
            invention". In "collective invention" settings, "rival firms (or
            independent individual developers) freely release one another pertinent
            information concerning the solution of non-trivial technical problems.
             Each firm, in turn, makes use of this information to incrementally
            improve on a basic common technological layout". Nuvolari's paper
            compares open source software with two episodes of nineteenth century
            technical advances to demonstrate this point. He concludes that in
            industries "where the dynamics of technological change display a
            cumulative and incremental character, the protection of "commons" of
            freely accessible knowledge is likely to yield much higher rates of
            innovation than the enforcement of strong intellectual property rights".
            Nuvolari is assistant professor in the Economics of Science and
            Technology at the Eindhoven University of Technology, the Netherlands
            and research fellow at the Eindhoven Centre for Innovation Studies.
       In 2011, [601]Brian Proffitt's "The new draw of open source: innovation: Open
       source isn't just alternative; for cutting-edge tech, it's the only game in
       town", pointed out that "there are things in open source software that you
       cannot get anywhere else".
       Yuwei Lin's PhD thesis (at the UK's University of York, Science and
       Technologies Studies Unit, Department of Sociology), [602]Hacking Practices
       and Software Development: A Social Worlds Analysis of ICT Innovation and the
       Role of Free/Libre Open Source Software examines the social world of FLOSS
       developers and its implications. Its major findings are (I quote but use
       American spelling):
         1. As a community of open source practices, the FLOSS social world allows
            diverse actors to engage in the innovation process and therefore
            contains more innovation resources than other relatively conventional
            software models.
         2. The strategic collaboration between the public (i.e., the free software
            community) and the private (i.e., information technologies corporations)
            sectors symbolizes a pattern of hybrid innovation that entails complex
            communications and networks.
         3. Tacit knowledge anchored in everyday experiences is peculiarly valued in
            a community-based innovation system where social networking and
            information sharing are undergoing vigorously.
         4. The development of FLOSS democratizes [the] software innovation process
            and allows lay people to develop their understanding and knowledge of a
            shared problem/issue, especially through the web, to challenge
            established views on the issue.
        On September 14, 2004, [603]The Economist (a highly respected magazine)
        gave Linus Torvalds an award for innovation, specifically as the person
        driving the most financially successful breakthrough in computing, for his
       work on the Linux kernel. His citation declares that this FLOSS project
       "created a huge following, eventually attracting big industry players such as
       Oracle, IBM, Intel, Netscape and others. It also spawned several new software
       companies, including Red Hat, SUSE LINUX and Turbolinux. Today, there are
       hundreds of millions of copies of Linux running on servers, desktop
       computers, network equipment and in embedded devices worldwide". The
       Committee for Economic Development (a 60-year-old pro-business think tank)
       [604]reports that "Open source software is increasingly important as a source
       of innovation; it can be far more reliable and secure than proprietary
       software because talented programmers around the world can examine the code
       and try to break its security, without having to worry about hidden backdoors
       or holes".
       This history of innovation shouldn't be surprising; FLOSS approaches are
       based on the scientific method, allowing anyone to make improvements or add
       innovative techniques and then make them immediately available to the public.
       [605]Eric Raymond has made a strong case for why innovation is more likely,
       not less likely, in FLOSS projects.
       Clearly, if you have an innovative idea, FLOSS makes it very easy to combine
       pre-existing code in novel ways, modifying them and recombining them in any
       way you wish. Hosting systems such as SourceForge and Savannah provide easy
       access to vast amounts of source code. There's even a [606]specialized search
       engine to find FLOSS code named [607]Koders.com, allowing for quick reuse of
       a variety of components. This unfettered access to source code for arbitrary
       purposes, without royalty restrictions, makes it easy to try out new ideas.
       [608]The Reuters story "Plugged in - Next Big Tech Ideas May Be Small Ones"
       by Eric Auchard (April 2, 2005) notes that FLOSS has reduced (by orders of
       magnitude) the cost of implementing new ideas, making it easier to start new
       businesses and products so that they can be brought to the marketplace.
       If you look at the actual scientific experiments on innovation, you find very
       surprising results, ones that help illuminate why FLOSS has so much
       innovation. [609]RSA Animate - Drive: The surprising truth about what
       motivates us is an especially approachable summary of that research.
        Basically, simple rewards (like "more money") and punishments ("you're
        fired") produce better results for tasks that are simple and straightforward
        - where you just follow pre-set rules. But when a task gets more complicated
        -- when it requires some conceptual, creative thinking (like developing
        software) -- these kinds of motivators do not work. Yes, people need enough
        money to live, but humans need more than money. Instead, the
       three factors that lead to better performance (and personal satisfaction) are
       autonomy, mastery, and purpose. FLOSS development is often better at enabling
       these factors, and thus can be especially good at unlocking a lot of
       innovation.
       In public, Microsoft has long asserted that FLOSS cannot innovate, or at
       least cannot innovate as well as Microsoft can. At first, the argument seems
       reasonable: why would anyone innovate if they (or at least their company)
       couldn't exclusively receive all the financial benefits? But while the
       argument seems logical, it turns out to be untrue. In February 2003,
       [610]Microsoft's Bill Gates admitted that many developers are building
       innovative capabilities using FLOSS systems. Microsoft's own secret research
       (later leaked as [611]"Halloween I") found that "Research/teaching projects
       on top of Linux are easily `disseminated' due to the wide availability of
       Linux source. In particular, this often means that new research ideas are
       first implemented and available on Linux before they are available /
       incorporated into other platforms". In contrast, when examining [612]the most
       important software innovations, it's quickly discovered that Microsoft
       invented no key innovations, nor was Microsoft the first implementor of any
       of them. In fact, [613]there is significant evidence that Microsoft is not an
       innovator at all. Thus the arguments, while sounding logical, ignore how
       innovation really occurs and what researchers say are necessary. Innovation
       requires that researchers be able to publish and discuss their work, and that
       leading-edge users be able to modify and integrate components in novel ways;
       FLOSS supports these requirements for innovation very well.
       If proprietary approaches were better for research, then you would expect
       that to be documented in the research community. However, the opposite is
       true; the paper [614]"NT Religious Wars: Why Are DARPA Researchers Afraid of
       Windows NT?" found that, in spite of strong pressure by paying customers,
       computer science researchers strongly resisted basing research on Microsoft
       Windows. Reasons given were: developers believe Windows is terrible, Windows
       really is terrible, Microsoft's highly restrictive non-disclosure agreements
       are at odds with researcher agendas, and there is no clear technology
       transition path for OS and network research products built on Windows. This
       last problem is especially interesting: you'd think that if you could improve
       a popular product, the improvement would get to users more quickly. But
        innovation usually doesn't work this way; most research creates prototypes
       that aren't products by themselves, and requires significant interaction
       between many people before the idea comes to fruition. In proprietary
       products, usually only the vendor can distribute changes, and publishing the
       detailed source code explaining the work is prohibited, stifling research. In
       contrast, NSA's Security-Enhanced Linux (SELinux) project could simply take
       GNU/Linux code, modify it however they liked to try out new concepts, and
        publish all the results for anyone to productize. However, if an innovation
        requires the cooperation of a proprietary vendor, it may not happen at all.
        [615]HP developed new technology for choking off the spread of
       viruses, but although HP got it to work well in its labs using systems like
       Linux, they couldn't duplicate the capability on Windows systems because "we
       [HP] don't own Windows". Stanford Law School professor Lawrence Lessig (the
       "special master" in Microsoft's antitrust trial) noted that [616]"Microsoft
       was using its power to protect itself against new innovation" and that
       Microsoft's practices generally threaten technical innovation - not promote
       it.
       The claim that FLOSS quashes innovation is demonstrably false. There are
       reports from IT managers that FLOSS encourages innovation, reports from
       developers that FLOSS encourages innovation, and a demonstrated history of
       innovation by FLOSS (such as in the development of the Internet and World
       Wide Web). In contrast, Microsoft fails to demonstrate major innovations
       itself, there is dissatisfaction by researchers and others about Microsoft's
       proprietary approaches, and Microsoft's own research found that new research
       ideas are often first implemented and available on FLOSS.
        Indeed, the use of FLOSS-like approaches to spur innovation has spread far
       beyond software. [617]Wired noted development in 2003 of a new medical system
       to save lives, which was developed using approaches similar to FLOSS. In
       short, many people from various backgrounds worked together, with remarkable
       results.
       This doesn't mean that having or using FLOSS automatically provides
       innovation, and certainly proprietary developers can innovate as well. And
       remember that innovation is not as important as utility; new is not always
       better! But clearly FLOSS does not impede innovation; the evidence suggests
       that in many situations FLOSS is innovative, and there is evidence suggesting
       that FLOSS actively aids innovation.

   While I cannot measure these issues well quantitatively, to many people they
   are actually the most important issues of all.

                                   9. Unnecessary Fears

   Some avoid FLOSS, not due to the issues noted earlier, but due to unnecessary
   fears of FLOSS. Let's counter some of them:
    1. Is proprietary software fundamentally better supported than FLOSS? No. There
       are actually two kinds of support for FLOSS: traditional paid-for support and
        informal community support. It's also important to note that proprietary
        vendors often drop support for their products over time; users of proprietary
        products then have no real recourse, while FLOSS users do.
       There are many organizations who provide traditional support for FLOSS for a
       fee; since these can be competed (an option not available for proprietary
       software), you can often get an excellent price for support. Again, an
       anti-trust lawyer would say that FLOSS support is "contestable". For example,
       many GNU/Linux distributions include installation support when you purchase
       their distribution, and for a fee they'll provide additional levels of
       support; examples of such companies include [618]Red Hat, [619]Novell (SuSE),
       [620]Mandriva (formerly MandrakeSoft), and [621]Canonical Ltd (which supports
       [622]Ubuntu, a derivative of Debian GNU/Linux). There are many independent
       organizations that provide traditional support for a fee as well. Some
        distribution projects are actively supported by a large set of companies and
       consultants you can select from; examples include [623]Debian GNU/Linux and
       [624]OpenBSD. The article [625]`Team'work Pays Off for Linux evaluated four
       different technical support services for GNU/Linux systems, and found that
       "responsiveness was not a problem with any of the participants" and that "No
       vendor failed to solve the problems we threw at it". Many other organizations
        exist to support very specific products; for example, Mozilla Firefox and
        Thunderbird support is available from [626]decisionOne and [627]MozSource, for
       many years [628]AdaCore (aka AdaCore Technologies or ACT) has sold commercial
       support for the FLOSS Ada compiler GNAT, and [629]MySQL AB sells commercial
       support for its FLOSS relational database system. It's very important to
       understand that FLOSS support can be competed separately from the software
       product; in proprietary products, support is essentially tied to purchase of
       a usage license.
        Users can also minimize any `fitness for purpose' risks through evaluation
        and testing, and by only using production releases of well-known, mature
        products from reputable distributors.
       As an alternative to paid support, you can also get unpaid support from the
       general community of users and developers through newsgroups, mailing lists,
       web sites, and other electronic forums. While this kind of support is
       non-traditional, many have been very satisfied with it. Indeed, in 1997
       InfoWorld awarded the "Best Technical Support" award to the "Linux User
       Community," beating all proprietary software vendors' technical support. Many
       believe this is a side-effect of the Internet's pervasiveness - increasingly
       users and developers are directly communicating with each other and finding
       such approaches to be more effective than the alternatives (for more on this
       business philosophy, see [630]The Cluetrain Manifesto). Using this
       non-traditional approach effectively for support requires following certain
       rules; for information on these rules, consult [631]"How to ask smart
       questions" and [632]How to Report Bugs Effectively. But note that there's a
       choice; using FLOSS does not require you to use non-traditional support (and
       follow its rules), so those who want guaranteed traditional support can pay
       for it just as they would for proprietary software.
        Indeed, proprietary software is often informally supported as well. User
        groups, magazines, and various organizations have sprung up over many years
        to support proprietary products, even ones that in theory have a formal
        support channel. This shows that formal support is often not effective,
        certainly not as effective as proprietary vendors would have you believe. But
        unlike those supporting proprietary software, non-traditional FLOSS support
        organizations have direct access to the source code and development
        information - which means they can be much more effective.
       And it's important to remember that for a proprietary product, the vendor can
       at any time decide to end support for a product -- while there is always an
       alternative for FLOSS users. This is especially a risk if a company goes out
       of business, is bought out, changes to a different market, or if the market
       becomes too small. But this can happen even when the company is profitable,
       doesn't change its basic market, the market is large, and there are many
       established users. After all, the vendor may have priorities not aligned with
       yours, and the vendor is usually the only organization that may make
       improvements and sell the product.
        An extreme example of how a proprietary vendor can abandon its users is
       [633]Microsoft's abandonment of the vast number of companies who use Visual
       Basic 6. Many large organizations have developed large infrastructures that
       depend on Visual Basic 6, and [634]one survey reports that 52% of all
       software developers use Visual Basic (at least occasionally); one developer
       [635]estimates that this plan abandons about 18 million software developers,
       of which an estimated 6 million are professionals, who developed tens of
       millions of Visual Basic applications. When Microsoft developed its ".NET"
       infrastructure, it also created a new product that it called "Visual Basic
       for .NET" (VB.NET). Unfortunately, VB.NET is completely incompatible with the
       Visual Basic 6 language so widely used by industry, so the millions of lines
       of code written using Visual Basic over many years cannot be used with VB.NET
        without essentially rewriting the programs from scratch. (The [636]migration
        wizard is essentially useless because there are so many incompatibilities.)
        A [637]former Microsoft VB product manager, Bill Vaughan,
       coined the name "Visual Fred" for VB.NET to emphasize how different the new
       product was from the old one, and [638]the term "Visual Fred" for VB.NET
        rapidly caught on. This is an enormous expense; if it takes on average $4,000
        to rewrite a Visual Basic application, and only 10% of an estimated 30
        million applications need to be rewritten, that means customers will end up
        paying $12 billion just to rewrite their software (without new
       functionality). Surveys show that Visual Basic 6 is still far more popular
       than VB.NET; [639]a 2004 survey found that 80% used Visual Basic 5 or 6,
       while only 19% used VB.NET. A [640]protest petition has been signed by more
       than 2,000 people (including 222 MVPs), and many companies have complained
       about the enormous and completely unnecessary expense of rewriting their
       programs just because Microsoft stopped supporting the original language.
       Nevertheless, Microsoft has decided to abandon Visual Basic 6 (mainstream
       support for VB6 ends on March 31, 2005), in spite of the outcry from most of
       its users. Since there never was a standard for Visual Basic, and its
       implementation is proprietary without obvious alternatives, Visual Basic 6
       users are stuck; they cannot take over development themselves, as would be
       possible for an FLOSS program. Instead, the majority of Visual Basic
       developers are [641]switching to other languages, primarily C# and Java. For
       example, [642]Evans Data found that of those who weren't staying with Visual
       Basic 6, only 37% of Visual Basic 6 users planned to switch to VB.NET; 31%
       said they plan to move to Java and 39% said they will be migrating to C#. You
        can see [643]ClassicVB.org for more information. This has drawn the ire of
        many who normally support Microsoft; [644]Kathleen Dollard said, "It is
        unconscionable (and should be illegal) for Microsoft to end mainstream
        support until everyone who made a good faith effort in light of their
        business environment has made the switch". You could say that [645]this
        extreme unwanted expense
       was the just punishment for developers who unwisely chose to use a language
       with no standard, no alternative implementation, and no mechanism to gain
       support if the vendor decided to stop supporting the original product. But
       this is little consolation for those many who have programs written in the
       now-abandoned Visual Basic 6, since they cannot be handled by the new VB.NET.
       In contrast, many FLOSS programs have been "abandoned" or had major changes
       in strategy contrary to their user's interests, but support did not end.
        Apache grew out of the abandonment of the NCSA web server program -- users
        banded together and restarted development, and the result quickly became the
        #1 web server.
       The GIMP was abandoned by its original developers, before it had even been
       fully released; again, users banded together and re-founded the project. The
        XFree86 project changed its licensing approach to one incompatible with many
        customers' requirements and failed to respond to the needs of many users;
       this led to the founding of another project that replaced it. Of course, if
       you are the only user of an FLOSS project, it may not be worth becoming the
       lead of a "follow-on" project -- but you at least have the right to do so. An
       FLOSS project cannot work too far against the interests of its users, because
       the users can wrest control away from those who try.
    2. Does proprietary software give users more legal rights than FLOSS? Or, isn't
       FLOSS legally more risky? No. Some have commented that "with FLOSS you give
       up your right to sue if things go wrong". The obvious retort is that
       essentially all proprietary software licenses also forbid lawsuits - so this
       isn't different at all! Anyone who thinks that they can sue Microsoft or
       other shrink-wrap proprietary vendors when things go wrong is simply fooling
       themselves. In any case, most users aren't interested in suing vendors - they
       want working systems. See [646]"A Senior Microsoft Attorney Looks at
       Open-Source Licensing", where Bryan Pfaffenberger argues that "With
       open-source software... you are, in principle, walking into the deal with
       your eyes wide open. You know what you're getting, and if you don't, you can
       find someone who does. Open-source licenses enable the community of users to
       inspect the code for flaws and to trade knowledge about such flaws, which
       they most assuredly do. Such licenses allow users to create derivative
       versions of the code that repair potentially hazardous problems the author
       couldn't foresee. They let users determine whether the program contains
       adequate safeguards against safety or security risks. In contrast, the
       wealthy software firms pushing UCITA are asking us to buy closed-source code
       that may well contain flaws, and even outright hazards attributable to
       corporate negligence - but they won't let us see the code, let alone modify
       it. You don't know what you're getting". Finally, if the software goes wrong
       and it's very important, you can fix it yourself or pay to have it fixed;
       this option greatly reduces risk, and it doesn't exist for proprietary
       software.
       There is another legal difference that's not often mentioned. Many
       proprietary programs require that users permit software license audits and
       pay huge fees if the organization can't prove that every use is licensed. So
       in some cases, if you use proprietary software, the biggest legal difference
       is that the vendors get to sue you.
       There are some claims that FLOSS creates special risks to users, but this
       doesn't seem to be true in practice. [647]Pillsbury Winthrop LLP noted that
       "The suggestion that users of [FLOSS] software are more likely to be sued for
       patent infringement than those that use proprietary software, like
       Microsoft's does not appear supported by actual experience. It is interesting
       to note that while Microsoft has had several dozen patent infringement
       lawsuits filed against it in the past few years, none have been reported
       against Linux, the most popular of all [FLOSS] programs". [648]Linda M.
       Hamel, General Counsel, Information Technology Division, Commonwealth of
       Massachusetts concluded that "Use of either open source or proprietary
       software poses some legal risk to states. States face fewer risks in
       connection with the use of open source software compared to their private
       sector counterparts, and the risks that they do face can be managed".
       ([649]Groklaw further commented on this). On February 7, 2005,
       [650]BusinessWeek published an opinion piece by Stuart Cohen of the Open
       Source Development Lab (OSDL); in that piece, he stated that SCO's attempt
       to sue IBM on Linux-related issues resulted in accelerating Linux's
       popularity and
       strengthening its legal foundation. He noted that many Linux developers,
       assisted by such interested parties, went to work to systematically examine
       every claim SCO put forth, and they investigated and vetted the code in great
       depth.
       [651]"Best Legal Practices for Open Source Software" by Dennis Kennedy
       (February 7, 2006) concludes with "Don't be an Open Source ostrich. Open
       Source software is not likely to go away nor are you likely to avoid it".
       Indeed, he notes that "It's easy to find frantic concerns about Open
       Source software over reasons that apply just as easily to [proprietary]
       software". He believes a ban on FLOSS is probably "impractical and unwise";
       instead, "a reasonable, evolving set of policies and procedures crafted to
       fit the business needs and corporate risk comfort level of your company will
       invariably be the best approach to take".
    3. Aren't FLOSS programs simply plagiarized proprietary programs? No. A
       programmer who has access to the source code of one program could illegally
       take that code and submit it to another related program. There are good
       reasons to believe this has happened many times in proprietary programs;
       since few people can view the source code of two different proprietary
       programs, some programmers may do it in the (plausible) belief that they
       won't be caught. However, it's unlikely that a programmer would copy code
       from a proprietary program to an FLOSS program without permission, because
       (1) the worldwide visibility of most FLOSS source code would make it easy for
       a proprietary vendor to detect the violation, and (2) the clear record of
       exactly who submitted the plagiarized code would make it easy to prosecute
       that lawbreaking programmer.
       A proprietary company could conceivably conspire to insert such code to try
       to discredit their FLOSS competitor. But the risk of tracing such an attack
       back to the conspirator is very great; the developer who does it is likely to
       talk and/or other evidence may provide a trace back to the conspirators.
       Alternatively, a proprietary company can claim that such an event has
       happened, without it actually having happened, and then use the false
       claim to spread fear, uncertainty, and doubt. But such a claim will
       eventually fall apart due to lack of evidence.
       A few years ago The SCO Group, Inc., began claiming that the Linux kernel
       contained millions of lines of its copyrighted code, and sued several
       companies including IBM. SCO has vocally supported several lawsuits, funded
       at least in part by Microsoft (via Baystar and a license purchase with no
       evidence that it will be used). Yet after repeatedly being ordered by a court
       to produce its evidence, [652]SCO has yet to produce any evidence that code
       owned by SCO has been copied into the Linux kernel. Indeed, it's not even
       clear that SCO owns the code it claims to own (it's in dispute with Novell on
       this point). In addition, [653]Open Source Risk Management (OSRM) did a
       detailed code analysis, and certified in April 2004 that the Linux kernel is
       free of copyright infringement. SCO claims that its contracts with IBM give
       it ownership over IBM-developed code, but previous documents relating to this
       contract inherited by SCO (such as newsletter explanations from AT&T and a
       previous court case involving BSD) give extremely strong evidence that this
       is not true. More information on the SCO vs. IBM case can be found at
       [654]Groklaw.net.
       In 2004 Ken Brown, President of Microsoft-funded ADTI, claimed that Linus
       Torvalds didn't write Linux, and in particular claimed that Torvalds stole
       much of his code from Minix. Yet it turns out that ADTI had previously hired
       Alexey Toptygin to find copying between Minix and Linux using automated
       tools, and [655]Toptygin found that no code was copied from Minix to Linux
       or from Linux to Minix. Andrew Tanenbaum, the author of Minix, strongly
       refuted
       Brown's unsubstantiated claims in a [656]statement, [657]follow-up, and
       [658]rebuttal. For example, Tanenbaum stated that "[Linus Torvalds] wrote
       Linux himself and deserves the credit". Tanenbaum also discredited Brown's
       claim that no one person could write a basic kernel; Tanenbaum noted that
       there are "six people I know of who (re)wrote UNIX [and] all did it
       independently". [659]Other reports find many reasons to believe that ADTI's
       claims are false; for example, the Associated Press noted that [660]Recent
       attacks on Linux come from dubious source.
       There are a vast number of FLOSS programs, almost none of which are involved
       in any dispute. No reasonable evidence has surfaced to justify the most
       publicized claims (of SCO and ADTI); these claims can be easily explained as
       attempts by a vendor to stall a competitor through the courts (see the terms
       barratry and vexatious litigation) and unfounded claims. There may be some
       cases, but given the widespread visibility of FLOSS source code, and the lack
       of plausible cases, they must be extremely rare. Thus, there is strong
       evidence that people really are (legally) developing FLOSS programs, and not
       simply copying program source code illegally from proprietary programs.
    4. Does FLOSS expose you to greater risk of abandonment? No. Businesses go out
       of business, and individuals lose interest in products, in both the
       proprietary and FLOSS world. A major difference, however, is that all FLOSS
       programs are automatically in escrow - that is, if their original developer
       stops supporting the product, any person or group can step forward to support
       it instead. This has been repeatedly demonstrated in FLOSS. For example, the
       [661]GIMP is a bitmapped graphical editor that was abandoned by its original
       developers (what's worse, they abandoned it before its initial release and
       failed to arrange for anyone else to succeed them). Nevertheless, even in
       this worst-case situation, after a period of time other users came forward
       and continued its development. As another example, [662]NCSA abandoned its
       web server "httpd", so some of its users banded together to maintain it - its
       results became Apache, the world's most popular web server.
    5. Are FLOSS licenses enforceable? In particular, is the GPL enforceable? Almost
       all FLOSS programs are released under some sort of license, and the most
       popular license is the GPL. A few competitors have claimed, in the past, that
       these licenses -- in particular the GPL -- are unenforceable. But legal
       scholars and lawyers who look into the issue generally scoff at such
       arguments.
       Eben Moglen (professor of law at Columbia University Law School and general
       counsel of the Free Software Foundation) wrote an article titled
       [663]Enforcing the GNU GPL, where he describes why the GPL is so easy to
       enforce -- and why he's been able to enforce the GPL dozens of times without
       even going to court. At the time, he stated that "We do not find ourselves
       taking the GPL to court because no one has yet been willing to risk
       contesting it with us there".
       Eben Moglen also gave a [664]keynote address at the University of Maine Law
       School's Fourth Annual Technology and Law Conference, Portland, Maine, June
       29, 2003, where he explains why it's so easy to enforce the GPL. He explains
       it this way: "because of the structure of my license, the defendant's
       obligation [is] affirmatively to plead it, if she wants to. After all, if she
       is distributing, it is either without license, in which case my license
       doesn't get tested -- there's an unlicensed distribution going on and it's
       enjoinable -- or the license is pled by the other side .... how
       interesting... For ten years, I did all of the GPL enforcement work around
       the world by myself, while teaching full time at a law school. It wasn't
       hard, really; the defendant in court would have had no license, or had to
       choose affirmatively to plead my license: they didn't choose that route.
       Indeed, they didn't choose to go to court; they cooperated, that was the
       better way... We got compliance all the time".
       [665]In 2004, the GPL was finally tested in court and found valid. On 14
       April 2004, a three-judge panel of a Munich, Germany court granted a
       preliminary injunction to stop distribution of a Sitecom product that was
       derived from GPL-licensed code, yet failed to comply with the GPL (see
       also the French article [666]La licence GPL sur un logiciel libre n'est
       pas une demi-licence! -- "The GPL license on free software is not a
       half-license!"). Soon afterwards, Sitecom Chief Executive Pim
       Schoenenberger said the company made
       changes to comply with the GPL. The preliminary injunction was later
       [667]confirmed on July 23, 2004, [668]along with a significant judgement.
       John Ferrell of law firm Carr & Ferrell declared that this German decision
       lends weight to the GPL, and that it "reinforces the essential obligations of
       the GPL by requiring that if you adopt and distribute GPL code, you must
       include the GPL license terms and provide source code to users," just as its
       license requires.
       In the U.S., the case Drew Technologies, Inc. v. Society of Automotive
       Engineers, Inc. (SAE) (Civil Action No. 03-CV-74535 DT, U.S. District Court,
       Eastern District of Michigan) involved GPL software. [669]A 2005 settlement
       left intact a GPL program's software license. While not as clear a judgement
       for the GPL as above, the judge clearly took the license seriously, and did
       not allow the license to simply be nullified.
       The license requirements for common FLOSS licenses are actually easy to
       comply with, but there is significant evidence that those terms are
       enforceable. This is good news for FLOSS users: clear, simple, and
       consistent requirements make it easy to understand what to do. It is also
       good news for developers who depend on licenses like the GPL to keep the
       code available for improvement.
    6. Are there special legal rules about incorporating FLOSS into my programs? No,
       fundamentally the same rules apply whether you incorporate proprietary or
       open source software into your program. In either case, you may only include
       software developed by someone else into your software if you have a license
       that permits you to do so, and you must follow the requirements of that
       license. For most proprietary programs, this can only be done by paying
       per-unit royalty fees and/or limiting your use (such as only using it for
       educational purposes). If you fail to obey those rules, you can be taken to
       court for damages, regardless of whether it's proprietary or FLOSS.
       Many proprietary programs include open source software, so it's obviously
       possible to do this legally. Microsoft Windows includes FLOSS components
       ([670]such as components from the University of California, Berkeley and its
       contributors which implement Internet-related capabilities), as does
       Microsoft Office (it uses [671]zlib).
       However, just as with proprietary software, you must examine the license
       first before you reuse someone else's software. Some FLOSS programs use
       licenses such as the BSD and MIT licenses, which explicitly permit you to
       reuse software in your system without any royalty fees as long as you
       follow some simple rules. However, you still have to follow those rules;
       for example, some require some sort of credit in the documentation or code
       itself. These are
       very low-cost requirements, and meeting them is far cheaper than writing the
       software yourself!
       The most common FLOSS license is the GPL, which allows you to use the
       software in arbitrary ways. However, the GPL strictly limits how you're
       allowed to combine the software with proprietary software (it does prohibit
       certain actions). The GPL also requires release of the source code to the
       recipients of the binary. We'll discuss the GPL more in the next point.
       Karen Faulds Copenhaver of Black Duck Software, in [672]"Reviewing Use of
       OSS in the Enterprise", discusses various myths, including the once-common
       myth that "You cannot use open source software in a proprietary
       environment". Instead, she notes that from a developer's perspective,
       FLOSS and proprietary code have essentially the same issues: you must
       understand and fulfill your license obligations. Indeed, she believes that
       FLOSS compliance will generally be much easier, and that the risk of
       enforcement is far higher for proprietary code, though the same remedies
       apply (see slide 18). Thus, by slide 19 she notes that organizations
       developing software of any kind (whether or not the software uses FLOSS
       components) must know what code is in the code base, must know the
       obligations of all licensed materials used (so they can fulfill them), and
       must know whether or not the license obligations of the various components
       are compatible. She notes that organizations who are developing software
       should embrace FLOSS (slide 36), but when they do, they should meet the
       obligations that come with it.
       Sometimes these licenses will be a deciding factor. For example, there are
       two common GUI toolkits on Linux-based systems: Gtk+ and Qt. Gtk+ is released
       under the LGPL license, and thus can be used by both FLOSS and proprietary
       programs without any royalty payments. Qt is available freely under a GPL
       license, and for a royalty fee under a proprietary license. If you don't
       want to make a royalty payment to Qt's developers (and/or are concerned
       about potential future payments and/or about how that might empower one
       company in the future), you could choose to use the Gtk+ library.
       On the other hand, if you're determined to illegally violate the licenses,
       then do not make the unwise presumption that you won't get caught. Since
       FLOSS source code is widely available, it turns out that it's often easy to
       determine if a product has stolen code, and people do actually do such
       analysis. [673]One developer quickly found and proved that "CherryOS" had
       blatantly stolen PearPC code. [674]Netfilter developers have had many
       successes in enforcing their licenses against people who sell black-box
       routers and wireless access points with stolen code. The site
       [675]GPL-violations.org has the [676]goal to resolve GPL violations, amicably
       where possible, and the [677]Free Software Foundation (FSF)'s Compliance lab
       handles investigation of alleged violations of the GPL and LGPL and
       subsequent enforcement when violations are confirmed. Besides being sued by
       an original developer (for stealing their work), you also won't be able to
       sue others if they steal your work, due to the legal doctrine called
       "unclean hands": if someone has stolen something from you, but you stole
       it in the first place, courts will tend to throw your case out.
       The bottom line: if you intend to reuse someone else's software in your own,
       you must always examine the license first before incorporating it into your
       system (to make sure its requirements are compatible with yours). This is
       true whether the code is proprietary or FLOSS. Development organizations
       normally have a process for evaluating licenses, so the task of evaluating an
       FLOSS license is just more of the same work they already have to do. If
       you're developing proprietary code, just make sure that your developers are
       legally obligated to go through a vetting process before reusing external
       code (this is standard practice in the industry). FLOSS licenses generally
       require that the license accompany the code it covers, so it's quite easy to
       get and review any license (it comes with the code you want to use!). If
       there's any doubt, there are search engines you can use to check. But this
       licensing decision is the same sort of decision that must already be made in
       any software development shop: before reusing code, you must ensure that its
       licensing requirements are compatible with your requirements, and that you
       comply with its requirements.
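       To make that vetting step concrete, here is a minimal, purely illustrative
       sketch (in Python) of the kind of automated check an organization might
       run before pulling in third-party code: it walks a vendored source tree,
       looks for SPDX-License-Identifier tags and a top-level LICENSE/COPYING
       file, and flags anything not on an approved list. The approved list, file
       layout, and path below are assumptions made up for the example, not a
       statement of anyone's actual policy, and such a script supplements (never
       replaces) review of the license text itself.

         # license_scan.py -- minimal, illustrative license-vetting sketch
         # (hypothetical policy). It flags SPDX-License-Identifier tags that are
         # not on an approved list, and notes trees with no LICENSE/COPYING file.
         import re
         from pathlib import Path

         APPROVED = {"MIT", "BSD-2-Clause", "BSD-3-Clause", "Apache-2.0"}  # assumed policy
         SPDX_RE = re.compile(r"SPDX-License-Identifier:\s*([A-Za-z0-9_.+-]+)")

         def scan(tree):
             findings = []
             root = Path(tree)
             for path in root.rglob("*"):
                 if not path.is_file():
                     continue
                 try:
                     text = path.read_text(errors="ignore")
                 except OSError:
                     continue
                 for spdx in SPDX_RE.findall(text):
                     if spdx not in APPROVED:
                         findings.append("%s: license '%s' needs review" % (path, spdx))
             if not any((root / name).exists() for name in ("LICENSE", "COPYING")):
                 findings.append("%s: no top-level LICENSE or COPYING file" % root)
             return findings

         if __name__ == "__main__":
             for finding in scan("third_party/some-library"):  # hypothetical path
                 print(finding)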
    7. Will unintentionally including GPL code in proprietary code force the rest of
       the product to be GPL'ed? No, though you can choose to do so. The GPL, like
       most licenses for proprietary software libraries, grants you the right to use
       code only under certain conditions. Many proprietary libraries require that
       you pay a fee for each copy, or a large fee for unlimited use. The GPL
       requires no fee, and indeed doesn't include many of the restrictions a
       typical proprietary software license includes. But the GPL does require that,
       if you include GPL'ed code as part of your code, you release the rest of
       the code under the GPL.
       So what happens if you are developing a proprietary product, and one of your
       developers includes GPL code directly into the product without your
       knowledge? Once that happens, you typically have three options: (1) release
       the rest under the GPL, (2) remove the GPL'ed code, or (3) arrange for the
       GPL'ed code to be released to you under a compatible license (this typically
       involves a fee, and some projects will not be willing to do this). This is
       not a good situation to be in; make sure that your developers know that they
       must not steal code from any source, but must instead ensure that the
       licenses of any software they include in your program (either open source
       software or proprietary software) are compatible with your licenses. Note that
       exactly the same thing happens if you incorporate someone else's proprietary
       code in your software, with typically even worse results, because proprietary
       vendors are more likely to sue without working with you and they can often
       show larger direct monetary losses.
       There are many ways proprietary and GPL programs can work together, but it
       must be carefully done to obey the licenses. The Linux kernel is GPL'ed, but
       proprietary applications can run on top of it (outside the kernel) without
       any limitations at all. The gcc compiler is GPL'ed, but proprietary
       applications can be compiled using it. A GPL program can be invoked by a
       proprietary program, as long as they are clearly separable.
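       To make the "clearly separable" case above concrete, here is a minimal,
       purely illustrative sketch in Python. It assumes the GPL'ed command-line
       tool gzip is installed; the calling program runs it as a completely
       separate process and exchanges data with it only through pipes, rather
       than linking GPL'ed code into itself. The sketch only illustrates the
       process boundary being described; it does not by itself settle any
       particular licensing question.

         # separate_process.py -- illustrative sketch: invoking a GPL'ed tool
         # (gzip is assumed to be installed) as a separate process, talking to it
         # only through its normal command-line interface and pipes.
         import subprocess

         def compress(data):
             # Run gzip as its own process; data goes in via stdin, compressed
             # bytes come back via stdout. No GPL'ed code is linked into this
             # program.
             result = subprocess.run(["gzip", "-c"], input=data,
                                     stdout=subprocess.PIPE, check=True)
             return result.stdout

         if __name__ == "__main__":
             print(len(compress(b"hello " * 1000)), "compressed bytes")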
       Indeed, there are a large number of misconceptions about the GPL, more than
       can be covered here. For more information about the GPL, a useful source is
       the [678]Frequently Asked Questions about the GNU GPL from the Free Software
       Foundation (the authors of the GPL).
    8. Is FLOSS economically viable? Yes. There are companies that are making money
       on FLOSS, or using FLOSS to support their money-making activities. Many
       papers have been written about how to make money using FLOSS, such as
       [679]Eric S. Raymond's "The Magic Cauldron" and [680]Donald K. Rosenberg's
       "How to make money with open-source software". The [681]IT Manager's Journal
       article from May 2004 by John C. Koenig describes "Seven open source business
       strategies for competitive advantage" (i.e., seven business strategies using
       open source software). [682]Bruce Perens' "The Emerging Economic Paradigm of
       Open Source" also provides useful insights. FLOSS isn't compatible with some
       business models, but FLOSS is certainly compatible with or supports other
       models. Capitalism does not guarantee that businesses can remain unchanged in
       changing environments.
       For example, [683]HP reported in January 2003 that it had annual sales of $2
       billion linked to GNU/Linux. [684]IBM reported in 2002 that they had already
       made almost all of their $1 billion investment in Linux back in only one year
       - i.e., as profit. [685]James Boyle's response "Give me liberty and give me
       death?" makes the extraordinary observation that "IBM now earns more from
       what it calls `Linux-related revenues' than it does from traditional patent
       licensing, and IBM is the largest patent holder in the world".
       The 2004 article [686]"Firefox fortune hunters" notes that "new businesses
       are cropping up to provide organizations ranging from museums to software
       companies to the U.S. Department of Defense with Mozilla-based applications
       -- for a fee". "Business is pretty crazy right now," said Pete Collins of the
       Mozdev Group, "With the popularity of Firefox and the economy rebounding,
       we've been swamped. We don't even advertise--clients find us and provide us
       with work".
       [687]The Financial Times Story "Could Linux dethrone the software king?" from
       January 21, 2003 analyzes some of the financial issues of FLOSS.
       [688]Joel Spolsky's "Strategy Letter V" notes that "most of the companies
       spending big money to develop open source software are doing it because it's
       a good business strategy for them". His argument is based on microeconomics,
       in particular, that every product in the marketplace has substitutes and
       complements. A substitute is another product you might buy if the first
       product is too costly, while a complement is a product that you usually buy
       together with another product. Since demand for a product increases when the
       prices of its complements decrease, smart companies try to commoditize their
       products' complements. For example, an automobile manufacturer may invest to
       reduce the cost of gas refinement - because if gas is cheaper, they'll sell
       more cars. For many companies, such as computer hardware makers and service
       organizations, supporting an FLOSS product turns a complementary product into
       a commodity - resulting in more sales (and money) for them.
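       In standard microeconomic notation (a textbook relationship, not something
       taken from Spolsky's letter), this is just the statement that the
       cross-price elasticity of demand is negative for complements:

         \epsilon_{xy} \;=\; \frac{\partial Q_x / Q_x}{\partial P_y / P_y} \;<\; 0
         \quad \text{(complements)},

       so lowering the price P_y of a complement raises the quantity demanded Q_x
       of the main product -- cheaper gas sells more cars, and commoditized
       (FLOSS-supported) software sells more hardware and services.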
       Although many FLOSS projects originally started with an individual working in
       their spare time, and there are many FLOSS projects which can still be
       described that way, the "major" widely-used projects tend to no longer work
       that way. Instead, most major FLOSS projects have large corporate backing
       with significant funds applied to them. This shift has been noted for years,
       and is discussed in papers such as [689]Brian Elliott Finley's paper
       Corporate Open Source Collaboration?.
       Also, looking only at companies making money from FLOSS misses critical
       issues, because that analysis looks only at the supply side and not the
       demand side. Consumers are saving lots of money and gaining many other
       benefits by using FLOSS, so there is a strong economic basis for its success.
       Anyone who is saving money will fight to keep the savings, and it's often
       cheaper for consumers to work together to pay for small improvements in an
       FLOSS product than to keep paying and re-paying for a proprietary product. A
       proprietary vendor may have trouble competing with a similar FLOSS product,
       because the FLOSS product is probably much cheaper and frees the user from
       control by the vendor. For many, money is still involved - but it's money
       saved, not money directly acquired as profit. Some FLOSS vendors have done
       poorly financially - but many proprietary software vendors (and restaurants!)
       have also done poorly, and that doesn't mean that FLOSS never works.
       Luckily for consumers, FLOSS products are not tied to a particular vendor's
       financial situation as much as proprietary products are.
       Fundamentally, software is economically different than physical goods; it is
       infinitely replicable, it costs essentially nothing to reproduce, and it can
       be developed by thousands of programmers working together with little
       investment (driving the per-person development costs down to very small
       amounts). It is also durable (in theory, it can be used forever) and nonrival
       (users can use the same software without interfering with each other, a
       situation not true of physical property). Thus, the marginal cost of
       deploying a copy of a software package quickly approaches zero. This explains
       how Microsoft got so rich so quickly (by selling a product that costs nearly
       nothing to replicate), and why many FLOSS developers can afford to give
       software away. See [690]"Open Source-onomics: Examining some pseudo-economic
       arguments about Open Source" by Ganesh Prasad, which counters "several myths
       about the economics of Open Source". [691]People are already experimenting
       with applying FLOSS concepts to other intellectual works, and it isn't known
       how well FLOSS concepts will apply to other fields. [692]Yochai Benkler's
       2002 Yale Law Journal article, "Coase's Penguin, or Linux and the Nature of
       the Firm" argues that FLOSS development is only one example of the broader
       emergence of a new, third mode of production in the digitally networked
       environment called "commons-based peer-production" (to distinguish it from
       the property- and contract-based models of firms and markets). He states that
       its central characteristic is that groups of individuals successfully
       collaborate on large-scale projects following a diverse cluster of
       motivational drives and social signals, rather than either market prices or
       managerial commands. He also argues that this mode has systematic advantages
       over markets and managerial hierarchies when the object of production is
       information or culture, and where the capital investment necessary for
       production (computers and communications capabilities) is widely distributed
       instead of concentrated. These advantages are that (1) it is better at
       identifying and assigning human capital to information and cultural
       production processes (a smaller "information opportunity cost" in assigning
       the best person for a given job), and (2) there are substantial increasing
       returns to allow larger clusters of potential contributors to interact with
       very large clusters of information resources in search of new projects and
       collaboration enterprises (because property and contract constraints have
       been removed). In short, it is clear that making economic decisions based on
       analogies between software and physical objects is not sensible, because
       software has many economic characteristics that are different from physical
       objects.
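       The near-zero marginal cost point above can be written out explicitly (a
       generic cost identity, with the symbols chosen here purely for
       illustration): if D is the one-time development cost, m the marginal cost
       of producing one more copy, and N the number of copies, the average cost
       per copy is

         \frac{D + N m}{N} \;=\; \frac{D}{N} + m \;\longrightarrow\; m \approx 0
         \quad \text{as } N \to \infty,

       which is why a widely deployed program can be sold far above its per-copy
       cost, or given away at essentially no per-copy cost.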
       One very interesting presentation is [693]Brent C. Williams' Open Source
       Business Models: A Wall Street Look at a Wild 2006 and the Prospects for Even
       More Fun in 2007. He examines Oracle Linux, the Microsoft-Novell deal, and
       Red Hat's stock price, and has lots of interesting insights.
       [694]Who Says You Can't Make Money with Open Source? (30 June 2011) by Jim
       Zemlin (Executive Director of the Linux Foundation) points out differences
       over the ten years from 2001 to 2011. Microsoft, which sometimes works with
       FLOSS but also tries to directly compete with it, has lost value; an investor
       who put $100K in Microsoft ten years ago would now have $69K. Red Hat, whose
       only business is providing service and support for FLOSS, has provided an 8X
       return over the S&P 500, and expects to have a billion dollars in revenue in
       2011. IBM (which builds services and products around FLOSS) has had a 43%
       increase in value over those ten years.
    9. Will FLOSS destroy the software industry? Won't programmers starve if many
       programs become FLOSS? No; increasingly FLOSS is commercially developed and
       supported. It's certainly possible that many FLOSS products will eliminate
       their proprietary competition, but that's the nature of competition. If FLOSS
       approaches pose a significant threat to proprietary development approaches,
       then proprietary vendors must either find ways to compete or join the FLOSS
       movement. No one mourns the loss of buggy whip manufacturers, who were driven
       out of business by a superior approach to transportation (cars). Heinlein
       noted that no one is guaranteed protection against change in Life-Line
       (1939): "There has grown up in the minds of certain groups in this country
       the notion that because a man or a corporation has made a profit out of the
       public for a number of years, the government and the courts are charged with
       the duty of guaranteeing such profit in the future, even in the face of
       changing circumstances and contrary public interest. This strange doctrine is
       not supported by statute nor common law. Neither individuals nor corporations
       have any right to come into court and ask that the clock of history be
       stopped, or turned back, for their private benefit."
       [695]Eric Raymond's "The Magic Cauldron" describes many ways to make money
       with FLOSS. One particularly interesting note is that there is evidence that
       95% of all software is not developed for sale. For the vast majority of
       software, organizations must pay developers to create it anyway. Thus, even
       if FLOSS eliminated all shrink-wrapped programs, it would only eliminate 5%
       of the existing software development jobs. And, since the FLOSS programs
       would be less expensive, tasks that are currently too expensive to fund
       could then employ developers, so widespread FLOSS development would not
       harm the
       ability of developers to make a living. [696]The Open Source Initiative has
       an article on why programmers won't starve, and again, [697]Bruce Perens'
       "The Emerging Economic Paradigm of Open Source" also provides useful
       insights.
       FLOSS doesn't require that software developers work for free; many FLOSS
       products are developed or improved by employees (whose job is to do so)
       and/or by contractors (who are paid to make specific improvements in FLOSS
       products). If an organization must have a new capability added to an FLOSS
       program, they must find someone to add it... and generally, that will mean
       paying a developer to develop the addition. That person may be internal to
       the organization, someone already involved in the program being modified, or
       a third party. The difference is that, in this model, the cost is paid for
       development of those specific changes to the software, and not for making
       copies of the software. Since copying bits is essentially a zero-cost
       operation today, this means that this model of payment more accurately
       reflects the actual costs (since in software almost all costs are in
       development, not in copying).
       There are several different systems for connecting people willing to pay for
       a change with people who know how to make the change. A common approach is to
       use your own employees to make the change necessary for what you want. But
       there are alternatives. Bounty systems (also called sponsor systems or pledge
       systems) are systems where a user asks for an improvement and states a price
       they're willing to pay for that improvement. Typical bounty systems allow
       others to join in, with the goal of accumulating enough of a bounty to entice
       a developer to implement the improvement. Some bounty systems are run by
       individual projects; others are third-party bounty systems that work like
       independent auction houses, connecting users with third-party developers.
       Many FLOSS projects run their own bounty systems, such as the [698]Mozilla
       projects, the [699]GNOME project, [700]Horde, [701]Asterisk, [702]Lime Wire,
       and [703]i2p.
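       As a purely hypothetical illustration of the aggregation idea behind such
       bounty systems (none of the projects above necessarily implements it this
       way), a pledge pool is little more than a map from an issue to its list of
       pledges, with a threshold that makes the accumulated bounty worth offering
       to a developer; the issue IDs, users, and threshold below are invented.

         # bounty_pool.py -- hypothetical sketch of bounty aggregation: users
         # pledge amounts toward an issue, and the pool reports when the total
         # reaches an agreed threshold.
         from collections import defaultdict

         class BountyPool:
             def __init__(self):
                 self.pledges = defaultdict(list)  # issue id -> list of (user, amount)

             def pledge(self, issue, user, amount):
                 self.pledges[issue].append((user, amount))

             def total(self, issue):
                 return sum(amount for _, amount in self.pledges[issue])

             def ready(self, issue, threshold):
                 # "Ready" just means the pooled pledges meet the threshold.
                 return self.total(issue) >= threshold

         pool = BountyPool()
         pool.pledge("bug-1234", "alice", 50.0)   # invented issue and users
         pool.pledge("bug-1234", "bob", 75.0)
         print(pool.total("bug-1234"), pool.ready("bug-1234", 100.0))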
       Bounty systems are sometimes directly supported by the project's bug tracking
       tools. For example, the [704]Mantis bug tracking system includes a
       [705]sponsorship option. Using this option, every time people report a bug or
       feature request, they can include an amount they're willing to pay for it.
       That means that any project that uses the Mantis bug tracker (including
       projects like [706]Plucker) automatically includes a bounty system. I expect
       that more bug/improvement tracking systems will include this capability in
       the future, since it easily integrates into the existing project processes,
       and it supports direct interaction between users and developers. The
       widely-used "Bugzilla" bug/feature tracking system, doesn't include a bounty
       system as of April 2005, but [707]a bounty system for Bugzilla has been
       proposed.
       Some users and governments offer a bounty from their own sites that describe
       what they want; [708]Mark Shuttleworth's bounties are a good example.
       [709]Software in the Public Interest funds specific FLOSS projects.
       Organizations that run bounty-like programs for FLOSS include
       [710]opensourcexperts.com's bounty list and the list run by the [711]Public
       Software Fund. There are also more general organizations that support
       third-party bounties or group fund-raising activities, such as
       [712]Ideacradle.com and [713]dropcash.com. Somewhat confusingly, the term
       "security bug bounty system" is often used for the system where anyone who
       reports a security defect is paid a certain amount; [714]Mozilla also has a
       security bug bounty program.
       Of course, the reverse can happen: Someone can propose to do something, if
       people will raise a certain amount of capital for it. [715]Fundable.org
       supports allowing people to post funding requirements, such as proposing to
       implement a new capability in an FLOSS project for a certain amount of money.
       This approach has been used to support FLOSS; [716]Newsforge describes how
       Fundable.org was used to quickly raise funds so Frederico Caldeira Knabben
       could port his FLOSS HTML editor to Safari on the Apple Macintosh.
       Another approach, primarily used when trying to transform a proprietary
       program into an FLOSS program (by buying the software from its previous
       owner) has been called a "software ransom"; users pool their money together
       with the purpose of paying the owner to release the (existing) product as
       FLOSS. For example, [717]Blender was released as FLOSS through a software
       ransom (termed the "Free Blender" campaign).
       Indeed, there has been a recent shift in FLOSS away from volunteer
       programmers and towards paid development by experienced developers. Again,
       see [718]Ganesh Prasad's article for more information. [719]Brian Elliott
       Finley's article "Corporate Open Source Collaboration?" stated that "Now
       corporate America is getting involved in the development process. This seems
       to be a common trend amongst individuals, and now corporations, as they move
       into the Open Source world. That is that they start out as a user, but when
       their needs outstrip existing software, they migrate from being mere users to
       being developers. This is a good thing, but it makes for a slightly different
       slant on some of the dynamics of the process". [720]AOL decided to spin off
       the Mozilla project as a separate organization; not only does the separate
       organization employ several full-time employees, but other organizations have
       worked to hire Mozilla workers. Fundamentally, paying software developers is
       similar to paying for proprietary licenses, except you only have to pay for
       improvements (instead of paying for each copy), so many organizations appear
       to have found that it's worthwhile. The [721]Boston Consulting Group/OSDN
       Hacker Survey (January 31, 2002) surveyed users of SourceForge and found that
       33.8% of the FLOSS developers were writing OSS code for "work functionality"
       (i.e., it was something they did as part of their employment). It also
       provided quantitative evidence that FLOSS developers are experienced; it
       found that FLOSS developers had an average age of 30 and that they averaged
       11 years of programming experience.
       [722]Government Computer News reported in July 2004 on a presentation by
       Andrew Morton, who leads maintenance of the Linux kernel in its stable
       form, and confirmed the trend towards paid FLOSS developers.
       Morton spoke at a meeting sponsored by the Forum on Technology and
       Innovation, to address technology-related issues, held by Sen. John Ensign
       (R-Nev.), Sen. Ron Wyden (D-Ore.), and the Council on Competitiveness. Morton
       noted that "People's stereotype [of the typical Linux developer] is of a male
       computer geek working in his basement writing code in his spare time, purely
       for the love of his craft. Such people were a significant force up until
       about five years ago...", but contributions from such enthusiasts "is
       waning... Instead, most Linux kernel code is now generated by corporate
       programmers". Morton noted that "About 1,000 developers contribute changes to
       Linux on a regular basis... Of those 1,000 developers, about 100 are paid to
       work on Linux by their employers. And those 100 have contributed about 37,000
       of the last 38,000 changes made to the operating system". The article later
       notes "Even though anyone can submit changes, rarely does good code come from
       just anyone. Morton noted that it is rare that a significant change would be
       submitted from someone who is completely unknown to the core developers. And
       all submitted code is inspected by other members of the group, so it is
       unlikely some malicious function may be secretly embedded in Linux... Far
       from being a project with a vast number of contributors, about half of
       those 37,000 changes are made by a core developer team of about 20
       individuals,
       Morton said".
       This trend has continued. The [723]Linux Foundation's December 2010 report
       "Linux Kernel Development: How Fast it is Going, Who is Doing It, What They
       are Doing, and Who is Sponsoring It" (by Jonathan Corbet, Greg Kroah-Hartman,
       and Amanda McPherson) found that 70% of the Linux kernel developers are
       provably being paid to do this development (and I believe the real figure is
       much higher).
       The September 3, 2004 article [724]Peace, love and paychecks gives one of
       many examples of this trend. Network Appliance (NetApp) pays significant
       money to one of the Linux lieutenants (Myklebust), as well as developing code
       for Linux, for a very simple reason: money. "What's in it for [NetApp] is
       sales; it can sell into the Linux market. This is not about philanthropy.
       There is plenty of mutual benefit going on here," says Peter Honeyman. The
       article notes that "Big companies pick up the tab for Linux development
       because the system helps them sell hardware and consulting services. HP
       claims $2.5 billion in Linux-related revenue in 2003, while IBM claims $2
       billion. Red Hat, which distributes a version of the Linux operating system,
       generated $125 million in revenues last fiscal year and carries a market
       value of $2.3 billion. Last year sales of Linux servers grew 48% to $3.3
       billion, and by 2008 Linux server sales could approach $10 billion, according
       to market researcher IDC". Since NetApp earned $152 million on sales of
       $1.2 billion, its Linux payoff is significant. Linux now contains bits of
       code
       written by NetApp's programmers, so that NetApp works particularly well with
       Linux. As a result, "it has won business it wouldn't have otherwise at
       Oracle, Pixar, Southwest Airlines, ConocoPhillips and Weta Digital, the
       effects studio behind Lord of the Rings". For fast-moving projects like the
       Linux kernel, the entire development process is supportive of developers
       of kernel improvements and drivers who contribute to the codebase... and
       not of those who try to rig the system and make proprietary kernel drivers
       (proprietary applications are fine). One person noted, [725]"the kernel
       developers all like how this [development process] is working. No stable
       internal-kernel [application programmer interface], never going to happen,
       get used to it (syscalls won't break)". Drivers outside of the official Linux
       kernel tree will typically become useless almost immediately; thus,
       developers must get their device drivers released as FLOSS and into the main
       kernel immediately, or the development process will rush away from them.
       Proprietary components are treated as if they don't exist, and if you don't
       support the community, people generally don't care. Organizations who try to
       acquire ownership of the kernel through licensing games quickly discover that
       their efforts are discarded.
       [726]BusinessWeek ran a lengthy cover story in January 2005 called "Linux
       Inc.", which described the whole GNU/Linux development process, and related
       business models, in detail.
       There are many other examples of this transition to paid-for FLOSS
       development. For example, [727]Nokia funded Matthew Allum to rapidly improve
       the Matchbox Window Manager (to support Nokia's N770 Internet Tablet).
       Walt Scacchi, a research scientist at the University of California at
       Irvine's Institute for Software Research, studies FLOSS, and found that
       salaries are 5-15% higher for core contributors to popular FLOSS projects.
       The article [728]Firefox fortune hunters quotes Scacchi, who explained that
       "These people are in demand... software developers who are identified as core
       contributors [to popular FLOSS projects] are likely to have market
       opportunities that conventional software developers would not have. If you've
       contributed to a software system used by millions of people, you've
       demonstrated something that most software developers have not done".
       The Investors.com article [729]"Open-Source Success Roiling Software Field"
       by Ken Spencer Brown (Sep. 21, 2005) discusses the continued and increasing
       commercialization of FLOSS. The article reports that "For every
       multimillion-dollar software program being sold, there's a good chance that
       at least one free alternative can do the same thing, at a fraction of the
       cost". He describes the transition this way: "In many ways, open source and
       its best-known product, Linux, have shed their counterculture roots and gone
       pro. Most of Linux's current programmers work for companies that want a say
       in the software's development". He also notes that today, many FLOSS products
       are developed by venture-capital based companies, "and venture capitalists
       all want their money back -- and then some". The reason is simple: "Contrary
       to popular belief, most business users insist on using paid-for versions of
       open-source software or add-on support". He quotes James Thomas, product
       marketing director at Business Objects, as saying, "Open source is a business
       model, and people forget that". Brown notes that "almost every large public
       tech company has embraced open source to one degree or another".
       [730]Robert Westervelt reported in SearchVB (a resource specializing in
       Microsoft's Visual Basic!) that security, web services, and Linux jobs
       continued to dominate the IT help-wanted ads in 2004, and were projected
       to remain among the hottest skill and certification areas in 2005. Tony Iams,
       principal analyst with D.H. Brown Associates Inc., said that "Linux for a
       long time had been targeted for edge of network type applications, but it's
       taking on support for a much broader range of applications... For a while, it
       looked like the future was Windows, but now there is a larger demand for a
       more hands-on understanding for the Unix and Linux philosophy of managing
       workloads". The [731]Free Software Foundation (FSF)'s Jobs in Free Software
       page is one of many places where companies and potential employees can find
       each other to work on FLOSS projects, but it certainly not the only such
       place.
       [732]Nathan Eddy's "Report: Open Source Adoption Increases App Dev Pay"
       (2008) quotes consulting company Bluewolf. Michael Kirven, co-founder and
       principal, says that "There's been a huge wave of people embracing open
       source technologies... [its availability] has far outstripped the people
       trained for them". They found that developers with a specialization in
       those technologies are in a position to ask for a 30 or 40 percent pay
       increase; "LAMP is everywhere now --... [and are] becoming a hot
       commodity".
       Corporate support of FLOSS projects is not a new phenomenon. [733]The X
       window system began in 1984 as a cooperative effort between MIT and Digital
       Equipment Corporation (DEC), and by 1988 a non-profit vendor consortium had
       been established to support it. The Apache web server [734]began in 1995,
       based on previous NCSA work. In other words, both X and Apache were developed
       and maintained by consortia of companies from their very beginning. Other
       popular FLOSS projects like MySQL, Zope, and Qt have had strong backing from
       a specific commercial company for years. But now there is more corporate
       acceptance in using FLOSS processes to gain results, and more understanding
       of how to do so. And as more FLOSS projects gain in maturity, it is more
       likely that some project will intersect with a given company's needs.
       It seems unlikely that so many developers would choose to support an approach
       that would destroy their own industry, and there are a large number of FLOSS
       developers. On January 28, 2003, Sourceforge.net all by itself reported that
       it had 555,314 registered users on its FLOSS development site, and many of
       the largest FLOSS projects are not hosted by Sourceforge.net (including the
       Linux kernel, the gcc compilation system, the X Window System GUI, the
       Apache web server, the Mozilla web browser, and the OpenOffice.org office
       suite). Unfortunately, there seems to be no data to determine the
       number of FLOSS developers worldwide, but it is likely to be at least a
       million people and possibly many, many more.
       FLOSS enables inexperienced developers to gain experience and credibility,
       while enabling organizations to find the developers they need (and will then
       pay to develop more software). Often organizations will find the developers
       they need by looking at the FLOSS projects they depend on (or on related
       projects). Thus, lead developers of an FLOSS project are more likely to be
       hired by organizations when those organizations need an extension or support
       for that project's program. This gives both hope and incentive to
       inexperienced developers; if they start a new project, or visibly contribute
       to a project, they're more likely to be hired to do additional work. Other
       developers can more easily evaluate that developer's work (since the code is
       available for all to see), and the inexperienced developer gains experience
       by interacting with other developers. This isn't just speculation; one of
       Netscape's presenters at FOSDEM 2002 was originally a volunteer contributor
       to Netscape's Mozilla project; his contributions led Netscape to offer him a
       job (which he accepted).
       Of course, FLOSS certainly has an impact on the software industry, but in
       many ways it appears quite positive, especially for customers. Since
       customers are the ones directly funding the specific improvements they
       actually want (using money and/or developer time), market forces push FLOSS
       developers directly towards making the improvements users actually want.
       Proprietary vendors try to identify customer needs using marketing
       departments, but there's little evidence that marketing departments are as
       effective as customers themselves at identifying customer needs. In FLOSS
       development, customers demonstrate which capabilities are most important to
       them, directly, by determining what they'll fund. Another contrast is that
       proprietary developers' funding motivations are not always aligned with
       customers' motivations. Proprietary development has strong financial
       incentives to prevent the use of competing products, to prevent
       inter-operation with competing products, and to prevent access to copies
       (unless specifically authorized by the vendor). Thus, once a proprietary
       product becomes widely used, its vendor sometimes devotes increasing efforts
       to prevent use, inter-operation, and copying, instead of improving
       capabilities actually desired by customers, even if those mechanisms
       interfere with customer needs. This trend is obvious over the decades of the
       software industry; dongles, undocumented and constantly changing data
       protocols and data formats, copy-protected media, and software registration
       mechanisms which interfere with customer needs are all symptoms of this
       difference in motivation. Note that an FLOSS developer loses nothing if their
       customer later switches to a competing product (whether FLOSS or
       proprietary), so an FLOSS developer has no incentive to insert such
       mechanisms.
       And many companies have been created to exploit FLOSS. No doubt many will
       fail, just like many restaurants fail, but those who succeed should do well.
       [735]The Star Tribune notes that starting a software company used to be hard
       work -- now people take FLOSS products, combine them to solve specific
       problems, and sell them (with support) at a large profit.
       Karen Shaeffer has written an interesting piece, [736]Prospering in the Open
       Source Software Era, which discusses what she views to be the effects of
       FLOSS. For example, FLOSS has the disruptive effect of commoditizing what
       used to be proprietary property and it invites innovation (as compared to
       proprietary software which constrained creativity). She thinks the big
       winners will be end users and the software developers, because "the value of
       software no longer resides in the code base - it resides in the developers
       who can quickly adapt and extend the existing open source code to enable
       businesses to realize their objectives concerned with emerging opportunities.
       This commoditization of source code represents a quantum step forward in
       business process efficiency - bringing the developers with the expertise into
       the business groups who have the innovating ideas".
       The article [737]"Zen and the Art of the Six-Figure Linux Salary" (September
       30, 2008) notes that upper-level Linux jobs are not only lucrative, but
       becoming more so, due to demand.
   10. Is FLOSS compatible with Capitalism? Yes. Years ago some tried to label FLOSS
       as "communistic" or "socialistic" (i.e., anti-capitalist), but that rhetoric
       has failed. One article explaining why FLOSS and capitalism are compatible is
       Ganesh Prasad's [738]How Does the Capitalist View Open Source?. This paper
       shows that FLOSS is quite consistent with capitalism: it increases wealth
       without violating principles of property ownership or free will. The
       developer of the Linux kernel, [739]Linus Torvalds, noted that U.S. copyright
       law specifically recognizes the exchange of copyrighted material as financial
       gain. US Code, Title 17 (copyrights), Chapter 1, Section 101: "Definitions"
       says that, "The term `financial gain' includes receipt, or expectation of
       receipt, of anything of value, including the receipt of other copyrighted
       works". He notes that this is part of the very fundamentals of copyright law.
       What's more, he notes that the GPL license (the most popular FLOSS license)
       "is designed so that people receive the value of other people's copyrighted
       works in return [for] their own contributions. That is the fundamental idea
       of the whole license - everything else is just legal fluff... the notion that
       the GPL has, of `exchange of receipt of copyrighted works,' is actually
       explicitly encoded in U.S. copyright law. It's not just a crazy idea that
       some lefty Commie hippie dreamed up..". See also the information on
       [740]economic viability and [741]starving programmers for more.
   11. If only FLOSS programs exist in a software category, will that completely
       eliminate competition? No. Oddly enough, FLOSS programs sometimes compete
       with each other in a given functional area. The text editors emacs (primarily
       GNU emacs) and vi (primarily vim) have dueled for decades. Sendmail is still
       a popular program for delivering email, but it has competition from other
       FLOSS programs such as Postfix and Exim. The desktop environments GNOME and
       KDE compete with each other, as do the OS kernels of Linux and the BSDs.
       Generally, competing FLOSS projects must distinguish themselves from each
       other to succeed (e.g., through user interface philosophies, design
       approaches, characteristics like security, licensing strategies, and so on),
       but of course that's true for competing proprietary programs too. Also,
       competing FLOSS programs generally try to stay compatible with each other
       (because their customers demand it) and sometimes even help each other with
       technical problems. For example, [742]freedesktop.org provides a forum to
       encourage cooperation among open source desktops for the X Window System
       (such as KDE and GNOME), and is part of the [743]Free Standards Group which
       tries to accelerate the use and acceptance of open source technologies
       through the development, application and promotion of standards. In addition,
       even if there is one product, multiple organizations can compete for
       maintenance and support (e.g., GNU/Linux distributors do this). Thus, even if
       FLOSS eliminates all proprietary programs in a given category, that would
       still not eliminate competition.
   12. Are FLOSS programs compatible with standards? Yes.
       FLOSS programs can implement standards, just like proprietary programs can.
       Indeed, FLOSS programs often implement relevant standards better than
       proprietary products. The reason is simple: FLOSS projects have no financial
       incentive to ignore or subvert a standard. A proprietary software maker's
       duty is to maximize profits. Proprietary makers may choose to do this by
       ignoring standards or
       creating proprietary extensions to standards; once customers depend on these
       proprietary interfaces, they will find it very difficult to switch to a
       different product, even if it's better. In contrast, FLOSS projects are
       generally supported directly by their users, who want to employ standards to
       maintain access to their data, simplify inter-operation with others, and
       simplify integration into their own environments.
       I have sometimes noted that FLOSS projects often end up creating executable
       specifications or executable standards. Traditional (paper) standards cannot
       be directly used by users, and always include ambiguities that are difficult
       to resolve later. In contrast, FLOSS programs can be used directly by users
       -- thus they help users more directly -- yet because their implementations
       are transparent, they can clarify any ambiguities in the documented
       standards. As FLOSS has grown, various bodies have worked to develop
       standards to support interoperability. This includes [744]the Free Standards
       Group, [745]Free Desktop.org, [746]Linux Standard Base, the [747]Filesystem
       Hierarchy Standard, and [748]X.org. There is also a great deal of interaction
       with standards-making groups such as the [749]IETF and the [750]W3C. See also
       my discussion on [751]single source solutions.
   13. Is FLOSS a "destroyer of intellectual property"? No. It's true that
       [752]Microsoft's Jim Allchin has claimed that OSS is an intellectual-property
       destroyer and that it's somehow "un-American". But you can use FLOSS products
       (e.g., a word processor) to develop private and proprietary information, and
       you can keep the information as confidential and proprietary as you want.
       What you can't do is use someone else's material in a way forbidden by law...
       and this is true for all software, not just FLOSS.
       One interesting case is the "General Public License" (GPL), the most common
       FLOSS license. Software covered by the GPL can be modified, and the modified
       code can be used in house without obligations. If you release that modified
       software, you must include an offer for the source code under the same GPL
       license. Basically, the GPL creates a consortium; anyone can use and modify
       the program, but anyone who releases the program (modified or not) must
       satisfy the restrictions in the GPL that prevent the program and its
       derivatives from becoming proprietary. Since the GPL is a legal document, it
       can be hard for some to understand. Here is one less legal summary
       ([753]posted on Slashdot):

     This software contains the intellectual property of several people.
     Intellectual property is a valuable resource, and you cannot expect to be able
     to use someone else's intellectual property in your own work for free. Many
     businesses and individuals are willing to trade their intellectual property in
     exchange for something of value; usually money. For example, in return for a
     sum of money, you might be granted the right to incorporate code from
     someone's software program into your own.

     The developers of this software are willing to trade you the right to use
     their intellectual property in exchange for something of value. However,
     instead of money, the developers are willing to trade you the right to freely
     incorporate their code into your software in exchange for the right to freely
     incorporate your code [which incorporates their code] into theirs. This
     exchange is to be done by way of and under the terms of the GPL. If you do not
     think that this is a fair bargain, you are free to decline and to develop your
     own code or purchase it from someone else. You will still be allowed to use
     the software, which is awfully nice of the developers, since you probably
     didn't pay them a penny for it in the first place.
       Microsoft complains that the GPL does not allow it to take such code and
       make changes that it can keep proprietary, but this is hypocritical.
       Microsoft doesn't normally allow others to make and distribute changes to
       Microsoft software at all, so the GPL grants far more rights to customers
       than Microsoft does.
       In some cases Microsoft will release source code under its "shared source"
       license, but that license (which is not FLOSS) is far more restrictive. For
       example, it prohibits distributing software in source or object form for
       commercial purposes under any circumstances. Examining Microsoft's shared
       source license also shows that it has even more stringent restrictions on
       intellectual property rights. For example, it states that "if you sue anyone
       over patents that you think may apply to the Software for a person's use of
       the Software, your license to the Software ends automatically," and "the
       patent rights Microsoft is licensing only apply to the Software, not to any
       derivatives you make". [754]A longer analysis of this license and the
       problems it causes developers is provided by Bernhard Rosenkraenzer (bero).
       The FSF has also posted a press release on why they believe the [755]GPL
       protects software freedoms.
       It's true that organizations that modify and release GPL'ed software must
       yield any patent and copyright rights for those additions they release, but
       such organizations do so voluntarily (no one can force anyone to modify GPL
       code) and with full knowledge (all GPL'ed software comes with a license
       clearly stating this). And such grants only apply to those modifications;
       organizations can hold other unrelated rights if they wish to do so, or
       develop their own software instead. Since organizations can't make such
       changes at all to proprietary software in most circumstances, and generally
       can't redistribute changes in the few cases where they can make changes, this
       is a fair exchange, and organizations get far more rights with the GPL than
       with proprietary licenses (including the "shared source" license). If
       organizations don't like the GPL license, they can always create their own
       code, which was the only option even before GPL'ed code became available.
       Although the GPL is sometimes called a "virus" by proprietary vendors
       (particularly by Microsoft) due to the way it encourages others to also use
       the GPL license, it's only fair to note that many proprietary products and
       licenses also have virus-like effects. Many proprietary products with
       proprietary data formats or protocols have "network effects," that is, once
       many users begin to use that product, that group puts others who don't use
       the same product at a disadvantage. For example, once some users pick a
       particular product such as a proprietary OS or word processor, it becomes
       increasingly difficult for other users to use a different product. Over time
       this enforced use of a particular proprietary product also spreads like a
       virus.
       Certainly many technologists and companies don't think that the GPL will
       destroy their businesses. Many seem too busy mocking Microsoft's claims
       instead (for an example, see [756]John Lettice's June 2001 article "Gates:
       GPL will eat your economy, but BSD's cool"). After all, [757]Microsoft sells
       a product with GPL'ed components, and still manages to hold intellectual
       property (see below).
       Perhaps Microsoft means the GPL "destroys" intellectual property because the
       owners of competing software may be driven out of business. If so, this is
       hypocritical; Microsoft has driven many companies out of business, or bought
       them up at fractions of their original price. Indeed, sometimes the
       techniques that Microsoft used have later been proven in court to be illegal.
       In contrast, there is excellent evidence that [758]the GPL is on very solid
       legal ground. "Destruction" of one organization by another through legal
       competition is quite normal in capitalistic economies.
       The GPL does not "destroy" intellectual property; instead, it creates a level
       playing field where people can contribute improvements voluntarily to a
       common project without having them "stolen" by others. You could think of the
       GPL as creating a consortium; no one is required to aid the consortium, but
       those who do must play by its rules. The various motivations for joining the
       consortium vary considerably (see the article [759]License to FUD), but
       that's true for any other consortium too. It's understandable that Microsoft
       would want to take this consortium's results and take sole ownership of
       derivative works, but there's no reason to believe that a world where the GPL
       cannot be used is really in consumers' best interests.
       The argument is even more specious for non-GPL'ed code. Microsoft at one time
       protested about open source software, but indeed, they are a key user of open
       source software; key portions of Microsoft Windows (including much of their
       Internet interfacing software) and Microsoft Office (such as compression
       routines) include open source software. In 2004, [760]Microsoft released an
       installation tool, WiX, as open source software on SourceForge. Indeed,
       [761]the release of WiX as FLOSS appears to be quite a success; after 328
       days on SourceForge, the WiX project has on the order of 120,000 downloads,
       and about two-thirds of the bugs logged have been fixed. Stephen R. Walli,
       formerly of Microsoft, reports that there's a core of half a dozen developers
       working predominantly on their own time (so Microsoft doesn't have to pay
       them). Yet Windows development customers are "happy and directly involved in
       the conversation with Microsoft employees. One stunning submission came from
       a developer who built a considerable tutorial on WiX. I did a quick page
       estimate and it looks like this developer gave the WiX project at least a
       month of his life".
       [762]Microsoft now actively develops and maintains several FLOSS projects,
       including Windows Installer XML (WiX), Windows Template Library (WTL)
       project, and FlexWiki. Jason Matusow, director of Microsoft's shared-source
       program, said the company "will expand its open-source programs over time,
       but is moving slowly as it tries to learn how to participate in open-source
       communities".
   14. Is there really a lot of FLOSS software? Yes. Freshmeat.net counts over
       21,000 branches of FLOSS software as of October 2002.
       Sourceforge.net hosts 55,424 FLOSS projects all by itself (as of January 28,
       2003). [763]The dmoz list of just OSes counts 114 FLOSS OSes; this includes old
       systems (re-enabling their support), experiments, and specialized projects.
       There's little reason to believe that this counts all FLOSS software, but it
       certainly indicates there's a large amount of it. These projects vary in
       value and quality, of course, just as proprietary programs do, but all of
       these FLOSS projects can be the basis of future work.
   15. Is having the ability to view and change source code really
       valuable/important for many people? Surprisingly, yes. It's certainly true
       that few people need direct access to source code; only developers or code
       reviewers need the ability to access and change code. But not having access
       to how your computer is controlled is still a significant problem. Bob Young
       of Red Hat uses the analogy of [764]having your car's hood welded shut to
       explain why even non-technical users need access to the source code. Here is
       his explanation, in his own words:

     Open source gives the user the benefit of control over the technology the user
     is investing in... The best analogy that illustrates this benefit is with the
     way we buy cars. Just ask the question, "Would you buy a car with the hood
     welded shut?" and we all answer an emphatic "No". So ask the follow-up
     question, "What do you know about modern internal-combustion engines?" and the
     answer for most of us is, "Not much".

     We demand the ability to open the hood of our cars because it gives us, the
     consumer, control over the product we've bought and takes it away from the
     vendor. We can take the car back to the dealer; if he does a good job, doesn't
     overcharge us and adds the features we need, we may keep taking it back to
     that dealer. But if he overcharges us, won't fix the problem we are having or
     refuses to install that musical horn we always wanted -- well, there are
     10,000 other car-repair companies that would be happy to have our business.
     In the proprietary software business, the customer has no control over the
     technology he is building his business around. If his vendor overcharges him,
     refuses to fix the bug that causes his system to crash or chooses not to
     introduce the feature that the customer needs, the customer has no choice.
     This lack of control results in high cost, low reliability and lots of
     frustration.
       To developers, source code is critical. Source code isn't necessary to break
       the security of most systems, but it is quite difficult to really fix problems
       or add new features without it. Microsoft's Bill Gates has often claimed
       that most developers don't need access to OS source code, but [765]Graham
       Lea's article "Bill Gates' roots in the trashcans of history" exposes that
       Gates actually extracted OS source code himself from other companies by
       digging through their trash cans. Mr. Gates said, "I'd skip out on athletics
       and go down to this computer center. We were moving ahead very rapidly:
       Basic, FORTRAN, LISP, PDP-10 machine language, digging out the OS listings
       from the trash and studying those". If source code access isn't needed by
       developers, why did he need it? Obviously, there's a significant advantage to
       developers if they can review the source code, particularly of critical
       components such as an operating system.
       See also the discussion on the [766]greater flexibility of FLOSS.
   16. Is FLOSS really just an anti-Microsoft campaign? No. Certainly there are
       people who support FLOSS who are also against Microsoft, but it'd be a
       mistake to view FLOSS as simply anti-Microsoft.
       Microsoft already depends on FLOSS software in its own applications; Windows'
       implementation of the basic Internet protocols (TCP/IP) was derived from
       FLOSS code, and its Office suite depends on the FLOSS compression library
       "zlib".. More recently, Microsoft admitted that critical infrastructure for
       its [767]forthcoming Windows Server 2003 Compute Cluster Edition will be
       FLOSS; as the eWeek article notes, it would have been difficult for them to
       compete otherwise. Microsoft could, at any time, release programs such as its
       OSes as FLOSS, take an existing FLOSS OS and release it, or provide
       applications for FLOSS systems. There is no licensing agreement that prevents
       this (though it certainly would be radically different than their current
       business processes, so no one expects this to happen any time soon). Indeed,
       FLOSS leaders often note that they are not against Microsoft per se, just
       some of its current business practices, and many have repeatedly asked
       Microsoft to join them (e.g., see [768]Free Software Leaders Stand Together).
       In many cases FLOSS is developed with and for Microsoft technology. On June
       21, 2002, [769]SourceForge listed 831 projects that use Visual Basic (a
       Microsoft proprietary technology) and 241 using C# (a language that
       originated from Microsoft). [770]A whopping 8,867 projects are listed as
       working in Windows. This strongly suggests that there are many FLOSS
       developers who are not "anti-Microsoft".
       Microsoft has said that it's primarily opposed to the GPL, but Microsoft
       sells a product with GPL'ed components. [771]Microsoft's Windows Services for
       Unix includes Interix, an environment which can run UNIX-based applications
       and scripts on the Windows NT and Windows 2000 OSes. There's nothing wrong
       with this; clearly, there are a lot of Unix applications, and since Microsoft
       wants to sell its OSes, Microsoft decided to sell a way to run Unix
       applications on its own products. But many of the components of Interix are
       covered by the GPL, such as gcc and g++ (for compiling C and C++ programs).
       (Microsoft seems to keep moving information about this; [772]here is a stable
       copy). The problem is not what Microsoft is doing; as far as I can tell,
       they're following both the letter and the spirit of the law in this product.
       The problem is that Microsoft says no one should use the GPL, and that no one
       can make money using the GPL, while simultaneously making money using the
       GPL. Bradley Kuhn (of the FSF) bluntly said, "It's hypocritical for them to
       benefit from GPL software and criticize it at the same time". Microsoft
       executives are certainly aware of this use of the GPL; Microsoft Senior Vice
       President Craig Mundie specifically acknowledged this use of GPL software
       when he was questioned on it. Kelly McNeill noted this dichotomy between
       claims and actions in the June 22, 2001 story [773]"Microsoft Exposed with
       GPL'd Software!" [774]A more detailed description about this use of the GPL
       by Microsoft is given in The Standard on June 27, 2001. Perhaps in the future
       Microsoft will try to remove many of these GPL'ed components so that this
       embarrassing state of affairs won't continue. But even if these components
       are removed in the future, this doesn't change the fact that Microsoft has
       managed to sell products that include GPL-covered code without losing any of
       its own intellectual property rights.
       In more recent years, Microsoft has begun to take a much more enlightened
       stand on FLOSS. Bill Hilf has explained, "Do I really care if it's open
       source or not if it sells our infrastructure?" (This is from the article
       "Cracking Open the Door to Open Source" by Carolyn A. April, Redmond
       Magazine, March 2007, pp. 26-36; this quote is on page 28.) Indeed, Microsoft
       has been increasingly encouraging FLOSS projects, even in some cases products
       that compete with Microsoft products. Why? Because use of those products
       encourages the sale of other Microsoft products. Hilf reports, with great
       candor, "Some people think that we're doing these deals to appear more
       `friendly', and that's not it at all. It's all about growing our business".
       This includes projects like WiX, IronPython, and even the entire
       Microsoft-sponsored CodePlex website for developing FLOSS programs.
       It's also worth noting the [775]Microsoft - Novell deal. Brett Smith (Free
       Software Foundation Licensing Engineer) states that in this deal, "Microsoft
       provides coupons for SUSE to companies, who then go to Novell to redeem the
       coupons and get their copy of the software. Those coupons procure the
       conveyance of lots of free software... Microsoft is already conveying GPLed
       software under this agreement".
       That being said, there are certainly many people who are encouraging specific
       FLOSS products (such as Linux) so that there will be a viable competition to
       Microsoft, or who are using the existence of a competitor to obtain the best
       deal from Microsoft for their organization. This is nothing unusual -
       customers want to have competition for their business, and they usually have
       it in most other areas of business. Certainly there is a thriving competitive
       market for computer hardware, which has resulted in many advantages for
       customers. [776]The New York Times' position is that "More than two dozen
       countries - including Germany and China - have begun to encourage
       governmental agencies to use such "open source" software ... Government units
       abroad and in the United States and individual computer users should look for
       ways to support Linux and Linux-based products. The competition it offers
       helps everyone".
   17. I've always assumed there's no free lunch; isn't there some catch? If there
       is an FLOSS product that meets your needs, there really isn't a catch.
       Perhaps the only catch is misunderstanding the term "free". The GPL includes
       this (haiku) text: "When we speak of free software, we are referring to
       freedom, not price". I.E., FLOSS is not necessarily cost-free. In practice,
       it's still often a bargain.
       Naturally, if you want services besides the software itself (such as
       guaranteed support, training, and so on), you must pay for those things just
       like you would for proprietary software. If you want to affect the future
       direction of the software - especially if you must have the software changed
       in some way to fit it to your needs - then you must invest to create those
       specific modifications. Typically these investments involve hiring someone to
       make those changes, possibly sharing the cost with others who also need the
       change. Note that you only need to pay to change the software - you don't
       need to pay for permission to use the software, or a per-copy fee, only the
       actual cost of the changes.
       For example, when IBM wanted to join the Apache group, IBM discovered there
       really was no mechanism to pay in money. IBM soon realized that the primary
       "currency" in FLOSS is software code, so IBM turned the money into code and
       all turned out very well.
       This also leads to interesting effects that explain why many FLOSS projects
       stay small for years, then suddenly leap into a mode of rapidly increasing
       functionality and user base. For any application, there is
       a minimum level of acceptable functionality; below this, there will be very
       few users. If that minimum level is large enough, this creates an effect
       similar to an "energy barrier" in physics; the barrier can be large enough
       that most users are not willing to pay for the initial development of the
       project. However, at some point, someone may decide to begin the "hopeless"
       project anyway. The initial work may take a while, because the initial work
       is large and there are few who will help. However, once a minimum level of
       functionality is reached, a few users will start to use it, and a few of them
       may be willing to help (e.g., because they want the project to succeed or
       because they have specialized needs). At some point in this growth, it is
       like passing an energy barrier; the process begins to become self-sustaining
       and exponentially increasing. As the functionality increases, the number of
       potential users begins to increase rapidly, until suddenly the project is
       sufficiently usable for many users. A percentage of the userbase will decide
       to add new features, and as the userbase grows, so does the number of
       developers. As this repeats, there is an explosion in the program's
       capabilities; the small simulation sketch below illustrates this threshold
       effect.
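       The sketch is a minimal, hypothetical model (written here in C); the growth
       rule and every constant in it (the "energy barrier" of 100 functionality
       units, the 1% contributor share, the per-period work rate) are illustrative
       assumptions rather than measurements. Its only purpose is to show that when
       a small fraction of users contribute code, growth is nearly flat until the
       minimum-functionality threshold is crossed, after which it becomes
       self-sustaining.

      /* Toy model of FLOSS project growth (illustrative assumptions only). */
      #include <stdio.h>

      #define MIN_FUNCTIONALITY 100.0 /* "energy barrier": features needed before adoption */
      #define CONTRIB_FRACTION  0.01  /* assumed share of users who contribute code */
      #define WORK_PER_PERIOD   1.0   /* functionality added per contributor per period */

      int main(void) {
          double functionality = 0.0;
          int users = 0, founders = 2;
          for (int t = 0; t < 120; t++) {
              int contributors = founders + (int)(CONTRIB_FRACTION * users);
              functionality += WORK_PER_PERIOD * contributors;
              if (functionality >= MIN_FUNCTIONALITY) {
                  /* Above the threshold, assume adoption grows with the functionality
                     surplus -- a deliberately crude stand-in for demand. */
                  users = (int)(10.0 * (functionality - MIN_FUNCTIONALITY));
              }
              if (t % 12 == 0)  /* print one line per "year" of 12 periods */
                  printf("period %3d: functionality=%8.1f users=%7d contributors=%5d\n",
                         t, functionality, users, contributors);
          }
          return 0;
      }

       Running it shows a long, nearly flat stretch followed by a sharp take-off
       once functionality passes the barrier, which is exactly the shape described
       above.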

                        10. FLOSS on the desktop: Client computing

   FLOSS programs have been competing for many years in the server market, and are
   now well-established in that market. FLOSS programs have been competing for
   several years in the embedded markets, and have already begun to significantly
   penetrate those markets as well.

   In contrast, FLOSS programs currently have only a small client (desktop and
   laptop) market share. This is unsurprising; FLOSS only began to become viable for
   client computing in 2002, and it takes time for any software to mature, be
   evaluated, and be deployed. Since FLOSS is a brand new contender in the client
   market, it has only begun penetrating into that market. However, FLOSS use on
   client systems has grown significantly, and there are reasons to think that it
   will grow even more significantly in the future.

   A few definitions are necessary first, before examining the issue in more depth.
   Many users' only direct experience with computers is through their desktop or
   laptop computers running "basic client applications" such as a web browser, email
   reader, word processor, spreadsheet, and presentation software (the last three
   together are often called an "office suite"), possibly with additional client
   applications, and all of these must have a graphical user interface and be
   supported by an underlying graphical environment. Such computers are often called
   "client" computers (even if they are not using the technical approach called the
   "client-server model"). Another term also used is the "desktop", even if the
   computer is not on a desk.

   The small FLOSS desktop market share should not be surprising, because viable
   FLOSS client applications only became available in 2002. As a practical matter,
   client systems must be compatible with the market leader, for example, the office
   suite must be able to read and write documents in the Microsoft Office formats.
   Before 2002 the available FLOSS products could not do this well, and thus were
   unsuitable for most circumstances. Clearly, FLOSS client applications cannot be
   considered unless they are already available.

   One point less understood is that FLOSS operating systems (like GNU/Linux) could
   not really compete with proprietary operating systems on the client until FLOSS
   (and not proprietary) basic client applications and environment were available.
   Clearly, few users can even consider buying a client system without basic client
   applications, since that system won't meet their fundamental requirements. There
   have been proprietary basic client applications for GNU/Linux for several years,
   but they didn't really make GNU/Linux viable for client applications. The reason
   is that a GNU/Linux system combined with proprietary basic client applications
   still lacks the freedoms and low cost of purely FLOSS systems, and the
   combination of GNU/Linux plus proprietary client applications has to compete with
   established proprietary systems which have many more applications available to
   them. This doesn't mean that GNU/Linux can't support proprietary programs;
   certainly some people will buy proprietary basic client applications, and many
   people have already decided to buy many other kinds of proprietary applications
   and run them on a GNU/Linux system. However, few will find that a GNU/Linux
   system with proprietary basic client applications has an advantage over its
   competition. After all, the result is still proprietary, and since there are
   fewer desktop applications of any kind on GNU/Linux, many capabilities have been
   lost, little has been gained, and the switching costs will dwarf those minute
   gains. There is also the problem of transition. Many organizations will find it
   too traumatic to immediately switch all client systems to an FLOSS operating
   system; it is often much easier to slowly switch to FLOSS basic client
   applications on the pre-existing proprietary operating system, and then switch
   operating systems once users are familiar with the basic client applications.
   Thus, the recent availability of FLOSS basic client applications has suddenly
   made FLOSS operating systems (like GNU/Linux) far more viable on the client.

   First, let's look at the available market share figures. According to [777]the
   June 2000 IDC survey of 1999 licenses for client machines, GNU/Linux had 80% as
   many client shipments in 1999 as Apple's MacOS (5.0% for Mac OS, 4.1% for
   GNU/Linux). More recent figures in 2002 suggest that GNU/Linux has [778]1.7% of
   the client OS market. Clearly, the market share is small at this early stage.
   Obviously, while this shows that there are many users (because there are so many
   client systems), this is still small compared to [779]Microsoft's effective
   monopoly on the client OS market. [780]IDC reported that Windows systems (when
   they are all combined) accounted for 92% of the client operating systems sold.

   However, there are many factors that suggest that the situation is changing:
   FLOSS basic client software is now available, there's increasing evidence of
   their effectiveness, Microsoft is raising prices, and organizations (including
   governments) want open systems:
    1. FLOSS basic client software is available. Back in 1997 I forecast that
       GNU/Linux would be "ready for the desktop" in 2002-2003 (5 years later). My
       forecast appears correct; FLOSS applications and environments matured in 2002
       to the point where they are finally functionally competitive on the client. In 2002,
       Mozilla finally released version 1.0 of their suite (including a web browser,
       email reader, and other tools), and the first reasonably usable version of
       OpenOffice.org, the first practically useful FLOSS office suite, was released
       in 2002 as well. Desktop environments matured as well; in 2002 both the GNOME
       and KDE projects released capable, more mature versions of their desktop
       environments. In addition, the WINE product (which allows FLOSS systems to
       run Windows programs) was finally able to run Microsoft Office 97, suggesting
       that although WINE is still immature, it may be sufficient to run some
       Windows applications developed internally by some organizations. In late
       2004, the Firefox web browser and the Thunderbird email client were released
       as the next generation of the Mozilla work.
       There are other plausible alternatives for client applications as well, such
       as Evolution (an excellent mail reader), Abiword (a lighter-weight but less
       capable word processor which also released its version 1.0 in 2002), Gnumeric
       (a spreadsheet), and KOffice (an office suite).
       However, I will emphasize OpenOffice.org, Firefox, and Thunderbird, for two
       reasons. First, they also run on Microsoft Windows, which makes it much
       easier to transition users from competitors (this enables users to migrate a
       step at a time, instead of making one massive change). Second, they are
       full-featured, including compatibility with Microsoft's products; many users
       want to use fully-featured products since they don't want to switch programs
       just to get a certain feature. In short, it looks like there are now several
       FLOSS products that have begun to rival their proprietary competitors in both
       usability and in the functionality that people need, including some very
       capable programs.
    2. There is increasing evidence of FLOSS client software effectiveness. The
       [781]MOXIE study of January 2003 randomly acquired 100 documents from the
       Internet in the Microsoft Office word processor, spreadsheet, and
       presentation software formats. Their leading FLOSS contender, OpenOffice.org
       version 1.0.1, did well; it was able to successfully use 97%, 98%, and 94% of
       the documents (of the respective formats). The study concluded that "the
       current state of interoperability is reasonably good, although there is
       significant room for improvement". Since that time, the OpenOffice.org
       developers have specifically worked to improve interoperability with
       Microsoft Office, and it's reasonable to expect that the figures are
       significantly higher now.
    3. Microsoft has raised its prices. Microsoft is changing many of its practices,
       resulting in increasing costs to its customers. It has changed its licensing
       so that one copy of Windows cannot be used for both home and office.
       Microsoft has switched its largest customers to a subscription-based approach
       (called "Licensing 6"), greatly increasing the costs to its customers.
       [782]TIC/Sunbelt Software Microsoft Licensing Survey Results (covering March
       2002) reports the impact on customers of this new licensing scheme. 80% had a
       negative view of the new licensing scheme, noting, for example, that the new
       costs for software assurance (25% of list price for servers and 29% of list
       price for clients) are the highest in the industry; a small worked example
       after this list shows how such annual percentages can accumulate. Of those
       who had done a cost analysis, an overwhelming 90% said their costs would
       increase if they migrate
       to 6.0, and 76% said their costs would increase from 20% to 300% from what
       they are paying now under their current 4.0 and 5.0 Microsoft Licensing
       plans. Indeed, 38% of those surveyed said that they are actively seeking
       alternatives to Microsoft products. [783]Licensing 6.0 can also significantly
       harm organizations trying to sell off a part of their operations. The program
       requires accelerated software maintenance payments when the computers that
       are covered under the license are sold off - but Microsoft is no longer
       obligated to provide maintenance even if the contract is fully paid.
       [784]Gartner's review of Star Office (Sun's variant of OpenOffice.org) also
       noted that Microsoft's recent licensing policies may accelerate moving away
       from Microsoft. As Gartner notes, "This [new license program] has engendered
       a lot of resentment among Microsoft's customers, and Gartner has experienced
       a marked increase in the number of clients inquiring about alternatives to
       Microsoft's Office suite... enterprises are realizing that the majority of
       their users are consumers or light producers of information, and that these
       users do not require all of the advanced features of each new version of
       Office... unless Microsoft makes significant concessions in its new office
       licensing policies, Sun's StarOffice will gain at least 10 percent market
       share at the expense of Microsoft Office by year-end 2004 (0.6 probability)".
       They also note that "Because of these licensing policies, by year-end 2003,
       more than 50 percent of enterprises will have an official strategy that mixes
       versions of office automation products - i.e., between multiple Microsoft
       Office versions or vendor products (0.7 probability)".
    4. Organizations (including governments) want open systems. Organizations,
       including governments, do not want to be locked into products and services
       from a single vendor. Multiple vendors mean competition between suppliers,
       generally driving down costs and increasing quality. See the [785]separate
       section on governments and FLOSS.
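
   To make the software-assurance percentages in point 3 above concrete, here is
   a small, purely hypothetical worked example in C. The $500 list price, the
   50%-of-list upgrade price, and the 4-year upgrade cycle are illustrative
   assumptions, not actual Microsoft prices; only the 29%-of-list annual figure
   comes from the survey cited above. The point is simply to show how annual
   percentages of list price can translate into cost increases in the 20% to 300%
   range that respondents reported.

      /* Hypothetical per-client licensing cost comparison (illustrative numbers). */
      #include <stdio.h>

      int main(void) {
          const double list_price        = 500.0;            /* assumed list price of client OS + office suite */
          const double assurance_rate    = 0.29;             /* Software Assurance for clients: 29% of list per year */
          const double old_upgrade_price = 0.5 * list_price; /* assumed upgrade price under the old scheme */
          const double old_upgrade_cycle = 4.0;              /* assumed years between upgrades under the old scheme */

          double new_per_year = list_price * assurance_rate;           /* 145.00 per client per year */
          double old_per_year = old_upgrade_price / old_upgrade_cycle; /*  62.50 per client per year */

          printf("new scheme: $%.2f per client per year\n", new_per_year);
          printf("old scheme: $%.2f per client per year\n", old_per_year);
          printf("increase:   %.0f%%\n", (new_per_year / old_per_year - 1.0) * 100.0);
          return 0;
      }

   Under these assumptions the per-client cost roughly doubles (about a 130%
   increase), squarely within the range the survey respondents reported.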

   There are some interesting hints that GNU/Linux is already starting to gain on
   the client. Some organizations, such as [786]TrustCommerce and the [787]city of
   Largo, Florida, report that they've successfully transitioned to using Linux on
   the desktop.

   Many organizations have found a number of useful processes for making this
   transition practical. Many start by replacing individual applications (and not
   the operating system underneath) with FLOSS equivalents. For example, they might
   switch to Mozilla as a web browser and email reader, and to OpenOffice.org as an
   office suite. Organizations can also move their infrastructure to web-based
   solutions that don't care about the client operating system. Eventually, they can
   start replacing operating systems (typically with a GNU/Linux distribution), but still
   using various mechanisms to run Microsoft Windows applications on them.
   [788]Various products allow users to run Microsoft Windows applications on
   GNU/Linux, including Windows application servers, Wine, win4lin, VMWare, and so
   on.

   There's already some evidence that others anticipate this; [789]Richard Thwaite,
   director of IT for Ford Europe, stated in 2001 that an open source desktop is
   their goal, and that they expect the industry to eventually go there (he controls
   33,000 desktops, so this would not be a trivial move). It could be argued that
   this is just a ploy for negotiation with Microsoft - but such ploys only work if
   they're credible.

   There are other sources of information on FLOSS or GNU/Linux for clients.
   [790]Desktoplinux.com is a web site devoted to the use of GNU/Linux on the
   desktop; they state that "We believe Linux is ready now for widespread use as a
   desktop OS, and we have created this website to help spread the word and
   accelerate the transition to a more open desktop, one that offers greater freedom
   and choice for both personal and business users".

   Bart Decrem's [791]Desktop Linux Technology & Market Overview, funded by Mitch
   Kapor, gives a detailed analysis and prognostication of GNU/Linux on the desktop.
   [792]Paul Murphy discusses transitioning large companies to Linux and Intel
   ("Lintel") on the desktop, and concludes that one of the biggest risks is trying
   to copy a Windows architecture instead of exploiting the different capabilities
   GNU/Linux offers.

   Indeed, it appears that many users are considering such a transition. [793]ZDNet
   published survey results on August 22, 2002, which asked "Would your company
   switch its desktop PCs from Windows to Linux if Windows apps could run on Linux?"
   Of the more than 15,000 respondents, 58% said they'd switch immediately; another
   25% said they'd consider dumping Windows in favor of Linux within a year. While
   all such surveys must be taken with a grain of salt, still, these are not the
   kind of responses you would see from users happy with their current situation.
   They also noted that ZDNet Australia found that 55% of the surveyed IT managers
   were considering switching from Microsoft products. Most people do not expect
   that this transition, if it happens, will happen quickly: it is difficult to
   change that many systems. But the fact that it's being considered at all is very
   intriguing. A number of opinion pieces, such as [794]Charlie Demerjian's "The IT
   industry is shifting away from Microsoft" argue that a major IT industry shift
   toward FLOSS is already occurring, across the board.

   [795]Many analysts believe Microsoft has extended Windows 98 support because it's
   worried that Windows 98 users might switch to GNU/Linux.

   There are certainly challenges for FLOSS desktops, even though they are
   successfully used and deployed right now. Software patents (an extremely
   controversial practice) make it illegal in some countries to create or use FLOSS
   implementations of some functions, particularly audio or video display.
   [796]Digital Rights Management, aka [797]Digital Restrictions Management (DRM),
   also makes it difficult to deploy FLOSS implementations. [798]World Domination 201
   discusses some of the challenges to FLOSS on the desktop. Nevertheless, many
   people are deploying and using FLOSS desktops.

                                    11. Usage Reports

   There are many reports from various users who have switched to FLOSS; here is a
   sample that you may find useful. This is not an exhaustive list, nor can it be.

   As discussed earlier, [799]the City of Largo, Florida supports 900 city employees
   using GNU/Linux, saving about $1 million a year. A [800]BusinessWeek online
   article notes that Mindbridge shifted their 300-employee intranet software
   company from Microsoft server products and Sun Solaris to GNU/Linux; after
   experiencing a few minor glitches, their Chief Operating Officer and founder
   Scott Testa says they now couldn't be happier, and summarizes that "...we're
   saving hundreds of thousands of dollars between support contracts, upgrade
   contracts, and hardware". [801]Amazon.com saved millions of dollars by switching
   to GNU/Linux. Oracle's Chairman and CEO, Larry Ellison, said that [802]Oracle
   will switch to GNU/Linux to run the bulk of its business applications no later
   than summer 2002, replacing three Unix servers. [803]A travel application service
   provider saved $170,000 in software costs during the first six months of using
   GNU/Linux (for both servers and the desktop); it also saved on hardware and
   reported that administration is cheaper too. [804]CRN's Test Center found that a
   GNU/Linux-based network (with a server and 5 workstations) cost 93% less in
   software than a Windows-based network, and found it to be quite capable. The
   article [805]Linux as a Replacement for Windows 2000 determined that "Red Hat
   Linux 7.1 can be used as an alternative to Windows 2000... You will be stunned by
   the bang for the buck that Linux bundled free `open source' software offers".

   [806]Windows to Linux: a corporate success story describes the experience of
   Amauta, a small Ecuadorian start-up focused on Web applications and network
   service integration.

   [807]"Running only on Open Source Software" is a 2008 article on according to
   Smartleaf of Cambridge, Mass. They're a small company (around 25 people), but
   they manage more than $6 billion in others' assets (their product is a financial
   account management system). According to Ritter, Smartleaf uses open-source
   software rather than purchasing "closed" proprietary software wherever possible.
   They said, "Building a Microsoft-based infrastructure to replicate what we do
   with open source would double the cost of new hires' desktops, and spending a lot
   more each year in various licensing costs", and since Microsoft Windows'
   automation tools are poor, using Windows would cause their administration costs
   to double. They believe "open-source software is more reliable and secure and
   maintained," though they have not collected the quantitative data to support
   that.

   Educational organizations have found FLOSS software useful. The [808]K12 Linux
   Terminal Server Project has set up many computer labs in the U.S. Northwest in
   elementary, middle, and high schools. For example, [809]St. Mary's School is a
   450-student Pre-K through 8th grade school in Rockledge, Florida that is
   applying GNU/Linux using this approach. Their examples show that kids don't find
   GNU/Linux hard to use, and that it is quite able to support educational goals. For
   example, third graders put together simple web pages about their favorite Saints
   using a variety of FLOSS programs: they logged into GNU/Linux systems, typed the
   initial content using Mozilla Composer (an FLOSS web page editor), drew pictures
   of the Saints using The Gimp (an FLOSS drawing program), and shared the results
   with Windows users using Samba. The page [810]Why should open source software be
   used in schools? gives various examples of educational organizations who have
   used FLOSS programs, as well as linking to various general documents on why
   educational organizations should use FLOSS. The [811]letter from the Kochi Free
   Software Users' Group to the Government of Kerala and others also summarizes some
   of the issues, especially why governments should specify standards (and not
   products) for educational use. The Faculty Senate of the University at Buffalo,
   State University of New York, approved [812]a resolution strongly supporting the
   use of FLOSS instead of proprietary software. The Northwest Educational
   Technology Consortium has an interesting set of information on FLOSS on its website,
   in the section [813]Making Decisions About Open Source Software (OSS) for K-12.

   Many financial organizations use FLOSS. In 2005, [814]Industrial and Commercial
   Bank of China (ICBC), China's biggest bank, signed an agreement with Turbolinux
   to integrate Linux across its banking network; this follows a September 2004
   announcement by the Agricultural Bank of China (ABC) that it would be moving to
   Linux thin-client terminals based on an optimized Red Hat Linux distribution.
   [815]The Chicago Mercantile Exchange credits its migration to commodity
   Intel-based servers and Linux with cutting costs and shaving a critical 100
   milliseconds off the time required to complete each trade. [816]Online brokerage
   E*Trade is moving its computer systems to IBM servers running GNU/Linux, citing
   cost savings and performance as reasons for switching to GNU/Linux (the same
   article also notes that clothing retailer L.L. Bean and financial services giant
   Salomon Smith Barney are switching to GNU/Linux as well). [817]Merrill Lynch is
   switching to GNU/Linux company-wide, and is hoping to save tens of millions of
   dollars annually within three to five years. [818]Adam Wiggins reports on
   TrustCommerce's successful transition to Linux on the desktop. [819]An April 22,
   2002 report on ZDNet, titled "More foreign banks switching to Linux", stated that
   New Zealand's TSB bank "has become the latest institution to adopt the
   open-source Linux OS. According to reports, the bank is to move all its branches
   to the Linux platform... in Europe, BP and Banca Commerciale Italiana feature
   among the big companies that have moved to Linux. According to IBM, as many as 15
   banks in central London are running Linux clusters". They also mentioned that
   "Korean Air, which now does all its ticketing on Linux, and [motor home]
   manufacturer Winnebago, are high-profile examples". [820]The Federal Aviation
   Administration's Air Traffic Control System Command Center in Herndon, Virginia
   is currently installing a system to support 2,000 concurrent users on Red Hat
   Linux. The
   system, known as the National Log, will act as a central clearinghouse database
   for users in air traffic centers across the country. [821]ComputerWorld reported
   in October 2002 an increasing use of GNU/Linux on Wall Street - Merrill Lynch
   reports that a majority of new projects are interested in GNU/Linux, for example,
   and the article references a TowerGroup (of Needham, MA) estimate that GNU/Linux
   is currently deployed on 7% of all servers in North American brokerage firms.
   TowerGroup also forecasts that GNU/Linux use will grow at an annual rate of 22%
   in the securities server market between 2002 and 2005, outpacing growth in
   Windows 2000, NT and Unix deployments.

   Some organizations are deploying GNU/Linux widely at the point of sale. Many
   retail cash registers are switching to GNU/Linux, according to Information Week
   ("Cash Registers are Ringing up Sales with Linux" by Dan Orzech, December 4,
   2000, Issue 815); on September 26, 2002, [822]The Economist noted that "Linux is
   fast catching on among retailers". According to Bob Young (founder of Red Hat),
   [823]BP (the petroleum company) is putting 3,000 Linux servers at gas stations.
   [824]Zumiez is installing open-source software on the PCs at all its retail
   locations, and expects that this will cut its technology budget between $250,000
   and $500,000 a year; note that this includes using Evolution for email, Mozilla
   for web browsing (to eliminate the need for printed brochures and training
   manuals), and an open source spreadsheet program. [825]Sherwin-Williams, the
   number one U.S. paint maker, plans to convert its computers and cash registers
   (not including back office support systems) in over 2,500 stores to GNU/Linux and
   has hired IBM to do the job; this effort involves 9,700 NetVista desktop personal
   computers.

   FLOSS is also prominent in Hollywood. Back in 1996, when GNU/Linux was considered
   by some to be a risk, [826]Digital Domain used GNU/Linux to generate many images
   in Titanic. After that, it burst into prominence as many others began using it,
   so much so that a [827]February 2002 article in IEEE Computer stated that "it is
   making rapid progress toward becoming the dominant OS in ... motion pictures".
   "Shrek" and "Lord of the Rings" used GNU/Linux to power their server farms, and
   now [828]DreamWorks SKG has switched to using GNU/Linux exclusively on both the
   front and back ends for rendering its movies. [829]Industrial Light & Magic
   converted its workstations and render farm to Linux in 2001 while it was working
   on Star Wars Episode II. They stated that "We thought converting to Linux would
   be a lot harder than it was" (from their SGI IRIX machines). They also found that
   the Linux systems are 5 times faster than their old machines, enabling them to
   produce much higher quality results. They also use Python extensively (an FLOSS
   language), as well as a number of in-house and proprietary tools. [830]Disney is
   also shifting to GNU/Linux for film animation.

   Many remote imaging systems use GNU/Linux. When a remote imaging system was
   placed at the North Pole, reporters noted that the Linux mascot was a penguin and
   announced that [831]Penguins invade the North Pole.

   There are many large-scale systems. [832]In October 2002, Chrysler Group
   announced it's using a Linux cluster computer for crash simulation testing and
   analysis in an effort to make safer cars and trucks. Their configuration uses 108
   workstations, each with 2 processors, so the system uses 216 processors, all
   running Red Hat Linux; Chrysler expects to improve simulation performance by 20%
   while saving about 40% in costs.

   FLOSS is widely used by Internet-based companies. [833]Google uses over 6,000
   GNU/Linux servers. [834]Yahoo! is increasing its already-massive use of FLOSS.
   Yahoo claims it is the "World's most trafficked Internet destination," based on
   Nielsen/NetRatings figures from August 2002. Yahoo had 201 million unique users,
   93 million active registered users, over 4500 servers, and over 1.5 billion page
   views a day. Yahoo noted that FLOSS already runs their business (e.g., Perl,
   Apache, FreeBSD, and gcc), and they've recently decided to move from their
   proprietary in-house languages to PHP (an FLOSS language). [835]Afilias has
   switched the registration database for the .org Internet domain from the
   proprietary Oracle to the FLOSS PostgreSQL database program; .org is the fifth
   largest top-level domain, with more than 2.4 million registered domain names.

   [836]Bloor Research announced in November 2002 that they believe GNU/Linux is
   ready to support large enterprise applications (i.e., it's "enterprise ready").
   They reached this conclusion after examining its scalability, availability,
   reliability, security, manageability, flexibility, and server consolidation
   characteristics. They concluded that "Linux now scales well on Intel hardware,
   and by taking advantage of [fail-over] extensions from Linux distributors and
   Grid suppliers, high availability can be achieved. Linux is proven to be
   reliable, especially for dedicated applications, and its open source nature
   ensures that it is at least as secure as its rivals". Only 3 years earlier Bloor
   had said GNU/Linux wasn't ready.

   [837]Librarians have also found many advantages to FLOSS.

   One interesting usage story is the story of [838]James Burgett's Alameda County
   Computer Resource Center, one of the largest non-profit computer recycling
   centers in the United States. Its plant processes 200 tons of equipment a month
   in its 38,000-square-foot warehouse. It has given thousands of refurbished
   computers to disadvantaged people all over the world, including human rights
   organizations in Guatemala, the hard-up Russian space program, schools, and
   orphanages. All of the machines have GNU/Linux installed on them.

   Indeed, for well-established products like GNU/Linux, very strong cases can be
   made for considering them. On October 18, 2002, [839]Forrester Research reported
   that "Linux is now ready for prime time". They stated that "CIOs have many new
   reasons to be confident that they'll get quality Linux support from their largest
   application vendors and systems integrators," referencing Amazon, Oracle, Sun,
   and IBM, among others who have made commitments that increase confidence that
   GNU/Linux is ready for deployment.

   Indeed, these uses are becoming so widespread that [840]Microsoft admits that
   FLOSS competition may force Microsoft to lower its prices, at least in the server
   market. Microsoft noted this in its 10-Q quarterly filing, stating that "To the
   extent the open source model gains increasing market acceptance, sales of the
   company's products may decline, the company may have to reduce the prices it
   charges for its products, and revenues and operating margins may consequently
   decline".

   Summaries of government use in various countries are available from
   [841]Infoworld and [842]IDG.

   Several organizations collect reports of FLOSS use, and these might be useful
   sources for more information. [843]Linux International has a set of [844]Linux
   case studies/success stories. Mandriva maintains [845]a site recording the
   experiences of business users of the Mandrake distribution. [846]Red Hat provides
   some similar information. Opensource.org includes some [847]case studies.

   The Dravis Group LLC published in April 2003 [848]Open Source Software: Case
   Studies Examining its Use, examining several specific use cases in depth. Their
   study of several different organizations deploying FLOSS concluded the following:
    1. Cost is a significant factor driving adoption of open source software.
    2. Control and flexibility are considered benefits as well.
    3. Implementation of open solutions is evolutionary, not revolutionary.
    4. Open source extends across the entire software stack.
    5. Product support is not a significant concern.
    6. Open source is not a magic solution.
    7. Open standards may be more important than open source.

                                12. Governments and FLOSS

   Practically all governments use FLOSS extensively, some develop FLOSS as well,
   and many have policies or are considering policies related to FLOSS. Motivations
   vary; for many governments, the overriding rationale for considering FLOSS is
   simply to reduce costs. Such governments will still take a variety of other
   factors into account such as reliability, performance, and so on, just like a
   commercial firm would do. Some governments may also consider the special
   privileges granted to them by FLOSS; e.g., there are direct advantages to users
   if they can examine the source code, modify the software to suit them, or
   redistribute the software at will.

   In contrast, some governments also consider FLOSS as a way of supporting other
   national policies. Here is a list of some of the other considerations that have
   been reported by various governments:
    1. supporting industrial policy -- a government may choose to support FLOSS to
       encourage the development of local companies who can train, support, and
        tailor products. A proprietary software product can only be maintained by a
        single company, which is often foreign and does not allow the best software
        jobs to be performed in that country.
    2. increasing competition / reducing dependence on, or control by, any one
       company -- a government may wish to prevent any one company from completely
       controlling the computing infrastructure of the government or its country;
       this is especially of concern to many if that one company is foreign. This is
       not necessarily the same as supporting industrial policy; the goal may be to
       simply support improved competition, foreign or not. After all, the effective
       monopolies in various software markets can be viewed as a market failure that
       requires correction. Lawrence Lessig's [849]Code and Other Laws of Cyberspace
       argues that "code is law" -- as computers become increasingly embedded in our
       world, what their code does, allows, and prohibits controls what we may or
        may not do in a very powerful way. "If code is law, who are the lawmakers?
       What values are being embedded in the code? Both questions are fundamentally
       about sovereignty. Who should be building this [electronic] world, and who
       should be specifying the values that this world will build into itself?"
        Governments may be increasingly skeptical of a world where their laws are
        increasingly rendered irrelevant by the controls of code from a single
        company. Proprietary vendors can also threaten governments into doing what
        they want by [850]threatening to withhold the product from that country -- a
        threat only possible because of the monopoly powers granted to proprietary
        vendors. They can also [851]threaten to pull out of countries, a threat again
        made stronger because of their monopoly power.
    3. security -- many are concerned about the security of software they depend on.
       Often proprietary products are bought and later found to be full of security
       vulnerabilities; FLOSS products at least provide governments with the option
       of detailed review of the source code, and to fix problems themselves without
       waiting for the vendor. Microsoft does have a "shared source" program for its
       operating system, allowing governments to look at source code, but this
       program does not generally permit the worldwide analysis and discussion that
       FLOSS permits, nor does it permit changes and redistribution by end-users. In
       many cases, the proprietary vendor is foreign or has foreign developers,
       which for some governments raises additional concerns -- can the foreign
       company's product be trusted? For example, a special key in [852]Microsoft
       Windows called NSAKEY was identified years ago, and whether or not this was a
       "back door" into Windows, it did reveal that this was a concern of many
       governments. Obviously, any developer can make a mistake leading to a
       security flaw, and a malicious developer could write subversive FLOSS
       software as well. But many believe the additional (worldwide) transparency
       provided by FLOSS, and the ability to repair and redistribute FLOSS programs
       immediately, provides additional protection.
    4. record longevity -- FLOSS reveals exactly how data is stored, so that
        important data is not lost. Governments using proprietary data formats risk
        loss of critical records if the company folds, stops supporting a particular
        format, or stops supporting a particular version of a format. Unfortunately,
        this occurs distressingly often.
    5. transparency of government data -- FLOSS enables complete review of exactly
       what is done and what data is stored, so that the public can freely receive
       that data without being required to buy products from any particular vendor.
     6. localization -- FLOSS can also be trivially localized, a critical advantage
        for languages with smaller numbers of speakers. With FLOSS, users do not
        need to convince a vendor to support their language; they can simply add
        that capability themselves. [853]An interview about OpenOffice.org
       discusses some of these points.

   For example, the United States federal government has a policy of neutrality;
   they choose proprietary or FLOSS programs simply considering costs and other
   traditional measures. In contrast, [854]Dr. Edgar David Villanueva Nuñez (a
   Peruvian Congressman) has written a detailed letter explaining why he believes it
   is beneficial (and necessary) for the Peruvian government to prefer FLOSS; his
   rationale was "Free access to public information by the citizen, permanence of
   public data, and security of the state and citizens" (which are the rationales of
   transparency, record longevity, and security above).

   The [855]Center for Strategic and International Studies has developed detailed
   analysis of FLOSS policies worldwide, including their [856]2004 survey of the
   FLOSS positions of various governments worldwide. The Open Source and Industry
   Alliance (OSAIA)'s [857]"Roundup of Selected OSS Legislative Activity WorldWide"
   (aka Policy Tracker) surveys government OSS policies in 2003 and 2004. The
   widely-cited [858]Free/Libre and Open Source Software (FLOSS): Survey and Study
   includes a great deal of information about public sector use of FLOSS. [859]An
   older but broad survey was published in 2001 by CNet. More information about
   governments and FLOSS can be found at the [860]Center of Open Source and
   Government (eGovOS) web site. The [861]Norwegian Board of Technology (an
   independent public think tank) has a global country watch on Open Source policy.
   The 2002 [862]Brookings Institution's "Government Policy toward Open Source
   Software" has a collection of essays about government and FLOSS. [863]Tom
   Adelstein's July 2005 article argues that "Major governments outside the United
   States either have adopted Linux and open-source software or have begun the
   process that will lead to adoption"; it includes several statistics and examples.

   [864]Robin Bloor's January 2005 article noted that many countries now have a
   stated policy of a preference for FLOSS; countries where this is the case, in
   some areas of government IT use, include Bahrain, Belgium, China and Hong Kong,
   Costa Rica, France, Germany, Iceland, Israel, Italy, Malaysia, Poland, Portugal,
   Philippines and South Africa. He also noted that nearly all "governments have R&D
   projects which are investigating the practicality of Open Source for government
   use which will, in all probability lead to local policy guidelines at some point
   which favour open source". [865]A 2002 New York Times article noted that "More
   than two dozen countries in Asia, Europe and Latin America, including China and
   Germany, are now encouraging their government agencies to use `open source'
   software". Robert Kramer of CompTIA (Computer Technology Industry Association)
   says that [866]political leaders everywhere from California to Zambia are
   considering legislating a preference for Open Source software use; he counted at
   least 70 active proposals for software procurement policies that prefer FLOSS in
   24 countries as of October 2002. There are certainly debates on the value of
   FLOSS preferences (even a few FLOSS advocates like Bruce Perens don't support
   mandating a government preference for FLOSS), but clearly this demonstrates
   significant positive interest in FLOSS from various governments.

   [867]Tony Stanco's presentation "On Open Source Procurement Policies" briefly
   describes why he believes governments should consider FLOSS. [868]Ralph Nader's
   Consumer Project on Technology gives reasons he believes the U.S. government
   should encourage FLOSS. The paper [869]Linux Adoption in the Public Sector: An
   Economic Analysis by Hal R. Varian and Carl Shapiro (University of California,
   Berkeley; 1 December 2003) makes several interesting points about FLOSS. This
   paper uses some odd terminology; for example, it uses the term "commercial
   software" where it means "closed source software" (this poor terminology choice
   makes the paper's discussion on commercial open source software unnecessarily
   difficult to understand). But once its terminology is understood, it makes some
   interesting points. It notes that:
    1. "The Linux operating system has achieved a `critical mass' sufficient to
       assure users that it will be available and improved for years to come,
       reducing the risk to users and to software developers.
    2. ... users adopting Linux are less likely to face "lock-in" than those
       adopting proprietary platform software, and they retain greater control over
       their own computing environments. These benefits are especially salient in
       complex computing environments ... as often occurs in the public sector.
    3. Open source software, such as Linux, typically uses open interfaces [that]
       typically lead to a larger, more robust, and more innovative industry and
       therefore software with open interfaces should be preferred by public sector
       officials, as long as it offers comparable quality to proprietary
       alternatives.
    4. Because Linux is open source platform software, adoption of Linux can help
       spur the development of a country's software sector, in part by promoting the
       training of programmers that enables them to develop applications that run on
       the Linux platform. The adoption of the Linux platform may well promote the
       economic development of commercial software to run in that environment.
    5. Fears that the licensing terms associated with Linux discourage the
       development of commercial software are misplaced... we expect mixed computing
       environments involving open source software and commercial software, that
       employ both open and proprietary interfaces, to flourish in the years ahead.

   Governments can also approach FLOSS differently for different circumstances.
   Governments need software to perform their own tasks, of course. Many governments
   are trying to increase the availability of computers (to reduce the "digital
   divide"), and many see FLOSS as a useful way to help do that (e.g., [870]Walter
   Bender, director of MIT's Media Lab, has recommended that Brazil install FLOSS on
   thousands of computers that will be sold to the poor, and not proprietary
   software; "Free software is far better on the dimensions of cost, power and
   quality".). And governments sometimes wish to influence their internal commercial
   markets to improve their competitiveness. [871]Many militaries are applying
   FLOSS, for a variety of reasons.

   Governmental organizations that choose to switch to FLOSS products can find a
   variety of documents to aid them. [872]Tom Adelstein has a short article on how
   to employ FLOSS inside governments (dated January 2005). The [873]International
   Open Source Network (IOSN) has a great deal of information about FLOSS, and aids
   developing countries in the Asia-Pacific region in applying FLOSS; they've
   produced documents such as [874]FOSS education primer. IOSN is an initiative of
   the United Nations (UN) Development Programme's (UNDP) Asia Pacific Development
   Information Programme (APDIP), and is supported by the International Development
   Research Centre (IDRC) of Canada. [875]The Interchange of Data between
   Administrations (IDA) Open Source Migration Guidelines (November 2003) and German
   [876]KBSt's Open Source Migration Guide (July 2003) have useful information about
   such migrations (though both are slightly dated, for example, some of the
   limitations they note have since been resolved).

   It's also worth noting that there's a resurging interest by governments to
   require the use of standards for data storage and data protocols that can be
   implemented by anyone, without any discrimination against an implementor. This
   desire is often not connected to FLOSS, and predates the rise of FLOSS in the
   marketplace. After all, governments have had a strong interest in
   non-discriminatory standards for decades, simply to prudently conduct business.
   For example, [877]on June 27, 2005, Morten Andreas Meyer, the Norwegian
   Minister of Modernization, announced at a press conference in Oslo that
   "Proprietary formats will no longer be acceptable in communication between
   citizens and government". [878]Massachusetts' Eric Kriss noted that what the
   state really wants is "open formats", by which they mean "specifications for data
   file formats that are based on an underlying Open Standard developed by an open
   community and affirmed by a standards body or de facto format standards
   controlled by other entities that are fully documented and available for public
   use under perpetual, royalty free, and nondiscriminatory terms". As they note,
   governments need to be able to access records 300 years later, and the risk of
   data loss if they use a proprietary format is very great. But such government
   goals do dovetail nicely with the use of FLOSS programs; FLOSS programs can
   implement open standards far more easily than they can implement any secret
   pre-existing formats, and FLOSS source code aids in documenting a format.
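
   To illustrate the point, a document stored in an openly documented format can be
   read years later with ordinary tools, because the specification is public. Below
   is a minimal sketch (an illustration, not taken from any of the studies cited
   above) showing how a hypothetical OpenDocument text file, "report.odt", could be
   read with nothing but a standard Python installation; an ODF document is simply
   a ZIP archive containing documented XML:

      # Minimal sketch: extract the text of a hypothetical ODF file ("report.odt")
      # using only the Python standard library. No vendor tool or secret
      # specification is required, which is the point of open formats.
      import zipfile
      import xml.etree.ElementTree as ET

      TEXT_NS = "{urn:oasis:names:tc:opendocument:xmlns:text:1.0}"

      with zipfile.ZipFile("report.odt") as odf:   # an ODF document is a ZIP archive
          content = odf.read("content.xml")        # the document body is plain XML

      root = ET.fromstring(content)
      for paragraph in root.iter(TEXT_NS + "p"):   # each <text:p> paragraph element
          print("".join(paragraph.itertext()))

   An undocumented proprietary format offers no such guarantee; recovering the data
   may require the original vendor's product, which may no longer be available.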

   Many countries favor or are considering favoring FLOSS in some way, such as
   [879]Peru, [880]the UK, and [881]Taiwan. [882]In Venezuela, presidential decree
   3,390 establishes that all systems of the public administration should
   preferentially use FLOSS (libre software); the Ministry of Science and Technology
   must give the Presidency plans and programs to support this. [883](see an English
   translation)

   A key issue in most governments is localization (e.g., to a particular locale's
   language and other conventions). In proprietary systems, the vendor must
   typically be convinced to support a given locale, or the program will simply be
   unavailable. In FLOSS, anyone can perform the localization -- and typically such
   efforts are achieved by many working together. However, there must be an effort
   to do localization, and in some locales some basic steps must be taken first
   (e.g., there must be agreed-on words for computer terms). The [884]United Nations
   Development Programme-Asia Pacific Development's "Free/Open Source Software:
   Localization" gives a "broad perspective on the localization of Free/Open Source
   Software (FOSS) for the benefit of policy- and decision-makers in developing
   countries," and includes several case studies.

   The following sections describe some government actions in the United States,
   Europe, and elsewhere. There is also a section on some attempts or perceived
   attempts to prevent government consideration of FLOSS. However, this information
   is by no means complete; this is simply a sample of some of the ongoing
   activities.

12.1 United States

   There are many government users of FLOSS in the United States, and a variety of
   related policies, studies, and recommendations. This includes departments and
   agencies of the federal government, as well as state and local governments. Many
   have advocated additional use or changes in approach. A summary of some of this
   information is below.

12.1.1 U.S. - General use and development

   First, let's discuss the general use and development of FLOSS in U.S.
   governments.

   The U.S. federal government has a formal policy of neutrality, that is, FLOSS and
   proprietary software must be considered using the same criteria, as noted in
   [885]Office of Management and Budget (OMB) memorandum M-04-16 "Software
   Acquisition" of July 1, 2004. This mirrors the earlier [886]2003 FLOSS policy of
   the U.S. Department of Defense, which clearly states that FLOSS and proprietary
   are both acceptable but must follow the same rules. Both also note that the
   license requirements for FLOSS differ from those of proprietary software, so
   acquirers should make sure they understand those requirements, which may differ
   from what they're used to. The United States' [887]Federal
   Enterprise Architecture includes the Technical Reference Model (TRM), and TRM
   version 1.1 (August 2003) includes both Linux and Apache.

   The (U.S.) President's Information Technology Advisory Committee (PITAC)'s
   report, the [888]Recommendations of the Panel on Open Source Software For High
   End Computing, recommends that the U.S. "Federal government should encourage the
   development of open source software as an alternate path for software development
   for high end computing". See the separate discussion on [889]MITRE Corporation's
   business case study of OSS (which emphasized use by the U.S. government,
   especially the U.S. military).

   [890]A NASA technical report describes in detail an approach for NASA to release
   some of its software as open source software.

   The U.S. National Imagery and Mapping Agency (NIMA) National Technical Alliance,
   through the National Center for Applied Technology (NCAT) consortium, funded the
   Open Source Prototype Research (OSPR) project. Under the OSPR project ImageLinks
   Inc., Tybrin Inc., Kodak Inc., and Florida Institute of Technology (Florida Tech)
   performed evaluations of open source software development practices and
   demonstrated the technological advantages of Open Source Software. The [891]OSPR
   final report includes those evaluations, a survey, and various related documents;
   these are actually rather extensive. The final report concludes:

     Open Source Software development is a paradigm shift and has enormous
     potential for addressing government needs. Substantial technology leverage and
     cost savings can be achieved with this approach. The primary challenge will be
     in establishing an organizational structure that is able to employ OSS
     methodology...

   Often, some government organization has to build some software to help implement
   a regulation, and it only makes sense to share that software (instead of every
   other organization paying to rebuild it). Making the software FLOSS simplifies
   this kind of sharing. The [892]Government Open Code Collaborative (GOCC) is a
   "voluntary collaboration between public sector entities and non-profit academic
   institutions. The Collaborative was created for the purpose of encouraging the
   sharing, at no cost, of computer code, developed for and by government entities
   where the redistribution of this code is allowed. Government entities, defined as
   a federal, state or local government, an authority or other sub-national public
   sector entity of the United States, can join the GOCC as Members". Another
   government project, the [893]Component Organization and Registration Environment
   (CORE), is a "government source for business process and technical components.
   CORE.GOV is the place to search for and locate a specific component that meets
   your needs, or to find components you can customize to meet your unique
   requirements". The [894]EUROPA - IDABC project has a similar role in Europe.

   [895]A 2007 survey found that open source software was gaining traction in the
   U.S. government. "More than half of all U.S. government executives have rolled
   out open-source software at their agencies, and 71 percent believe their agency
   can benefit from open-source software... Fifty-five percent of respondents said
   their agencies have been involved or are currently involved in an open-source
   implementation". Other results were that "29 percent of respondents who haven't
   adopted open-source software plan to do so in the next six to 12 months", "88
   percent of those in intelligence agencies said that their agencies can benefit
   from open source", and "Ninety percent of the respondents who have implemented
   open-source software said they believe their agency benefits". To be fair, this
   survey of 218 IT decision-makers in the U.S. government was commissioned by the
   [896]Federal Open Source Alliance, a group "pushing the use of open-source
   software in government". The alliance is made up of Intel, Hewlett-Packard (HP)
   and Red Hat. It's worth noting that Intel and HP make money whether or not FLOSS
   is used.

   Some U.S. government agencies also have specific processes for FLOSS. For
   example, [897]NASA has a program for releasing some of its programs as FLOSS.

   [898]Federal Computer Week's Linux Use Drives Innovation notes that FBI officials
   started a project that became the Emergency Response Network (ERN), a Linux-based
   information-sharing system specifically to support emergency responses. Jo
   Balderas, YHD Software's chief executive officer, said that by using widely-used
   FLOSS, "we can deliver fast, easy, cost-effective technology that has
   successfully addressed many of the information-sharing challenges that are
   obstacles to homeland security".

   [899]Jim Stogdill is CTO at Gestalt, and his blog "limnthis" discusses FLOSS and
   government (primarily U.S. government).

   The paper [900]Open Source and These United States by C. Justin Seiferth
   summarizes that:

     The Department of Defense can realize significant gains by the formal
     adoption, support and use of open licensed systems. We can lower costs and
     improve the quality of our systems and the speed at which they are developed.
     Open Licensing can improve the morale and retention of Airmen and improve our
     ability to defend the nation. These benefits are accessible at any point in
     the acquisition cycle and even benefit deployed and operational systems. Open
     Licensing can reduce acquisition, development, maintenance and support costs
     and increased interoperability among our own systems and those of our Allies.

   NetAction has proposed more FLOSS use and encouragement by the government; see
   [901]The Origins and Future of Open Source Software by Nathan Newman and [902]The
   Case for Government Promotion of Open Source Software by Mitch Stoltz for their
   arguments.

   More recently, [903]The U.S. Department of Defense Information Systems Agency
   (DISA) has certified Linux distributor Red Hat's Advanced Server operating system
   as a "Common Operating Environment" (COE), meaning the server product meets the
   agency's software security and interoperability specification.

   [904]The U.S. Congress' National Defense Authorization Act (2009) says: "The
   committee is concerned by the rising costs and decreasing security associated
   with software development for information technology systems. These rising costs
   are linked to the increasing complexity of software, which has also resulted in
   increasing numbers of system vulnerabilities that might be exploited by malicious
   hackers and potential adversaries. The committee encourages the department to
   rely more broadly on (open-source software) and establish it as a standard for
      intra-department software development".

   U.S. state governments have widely used FLOSS too. The Center for Digital
   Government's 2003 "Best of the Web" awards named the top 5 state web sites as
   Utah, Maine, Indiana, Washington, and Arkansas. [905]Four of the five winning
   state web sites use FLOSS programs to implement their site. The only state in the
   top five not using FLOSS was Washington - Microsoft's home state.

   Some states, such as [906]Massachusetts, have a formal policy encouraging the use
   of open standards. It is often easier to deploy FLOSS, if you choose to do so, if
   you're already using open standards; it's much more difficult to change to either
   a proprietary or FLOSS product if you're stuck using proprietary standards.

   [907]The 2004 report of the California Performance Review, a report from the
   state of California, urges that "the state should more extensively consider use
   of open source software", stating that FLOSS "can in many cases provide the
   same functionality as closed source software at a much lower total cost of
   ownership".

   [908]California's Air Resources Board (ARB) has had a great deal of experience
   with FLOSS; their web page on [909]ARB's Open Source Initiatives provides much
   more information.

   [910]Stanislaus County has saved significant amounts of money through smart
   migration to FLOSS programs like Linux and JBoss. Richard Robinson, the director
   of strategic business technology (not the county's CEO), once worked at Accenture
   (Andersen Consulting) and has been working hard to identify the county's needs
   and meet them. In two years, he reduced costs in his department by 30-65%,
   depending on how you measure it. In 2002, 2% of the county's servers used Linux;
   by 2004, 25% used Linux, and that was expected to increase to 33% the following
   year.

  12.1.2 Specific Examples of FLOSS Development

   Here are some specific examples of U.S. government-developed FLOSS:
    1. GNAT (GPL + extensions), an Ada compiler. Original development was funded by
        the Air Force. [911]GNAT is now commercially supported. The license is, in
        practice, similar to the LGPL.
    2. SELinux (GPL), a major security enhancement to Linux. It is now included in
       the Linux kernel, so it's primarily supported by the Linux kernel project.
       Red Hat (and others) deploy it.
    3. Expect (public domain), a scripting language. The [912]NIST Expect site and
       [913]SourceForge Expect site have more information.
    4. There's a lot of government-sponsored OSS in the geospatial tools domain. See
       the [914]Open Source Geospatial Foundation, including the Open Source
       Software Image Map (OSSIM) project.
    5. "Workforce Connections" and EZRO (GPL). Development funded by Dept. of Labor.
       The contractor who actually developed it is DevIS. There's a nice article
       about it in the "DoD SoftwareTech News", June 2007, pp. 32-35 by Peter
       Gallagher.
     6. Evergreen, an open source, enterprise-class library management system
        developed by the Georgia Public Library Service. There is a [915]Linux.com
        article about Evergreen.
    7. [916]GiG Lite and [917]World Wide Consortium for the Grid (W2COG) are working
       in this direction.
    8. See [918]NASA's FLOSS programs.
    9. [919]rVooz (Apache 2.0) "is an Open and collaborative project designed to
        make dynamic connections between people with shared context, whether
       geo-spatial, structural, or any other form. rVooz is a software suite
       designed to make contextual connections, or 'contextions,' between people who
       may or may not have a priori knowledge of each other. It is designed to bring
       people together even if they don't have each other in their buddy lists or
       know each other's phone numbers". [920]Comments on rVooz.
   10. [921]Delta3D (LGPL) is a "widely used and well-supported open source game and
       simulation engine... [it is a] fully-featured game engine appropriate for a
       wide variety of uses including training, education, visualization, and
       entertainment. Delta3D is unique because it offers features specifically
       suited to the Modeling and Simulation and DoD communities such as High Level
       Architecture (HLA), After Action Review (AAR), large scale terrain support,
       and SCORM Learning Management System (LMS) integration". You can learn more
       from the [922]July 2006 issue of JDMS and [923]limnthis.
   11. A lot of U.S. government-funded research produces open source software.
       There's just too much to list. This includes the original TCP/IP suite
       developed for the BSDs, a key event that enabled the development of the
       Internet.

   [924]Hamel's "Open-Source Collaboration in the Public Sector: The Need for
   Leadership and Value" examined some FLOSS existing efforts and found
   (unsurprisingly) that leadership matters. Highlights: "Collaborations with a
   strong leadership structure, and more importantly a single leader who is
   persistent, passionate and willing to spend a great deal of time maintaining and
   improving the organization are much more likely to succeed. Value is also a
   critical component, and requires that efforts meet the wants and needs of members
   and clients, whether they be in the form of software, documentation, research or
   even policy advocacy". Focusing on a few most useful projects is critical: "a
   conscious effort to focus energy on a small number of projects in early stages
   may be an important component in creating value for members of collaborative
   efforts". A FLOSS project requires collaboration to be successful, and
   collaboration requires that the project gain the trust of potential
   users/developers; "In this research I found that leadership, face-to-face
   contact, and the legal framework were the primary factors leading to trust. A
   willingness and ability to evolve, which may be tied to creating products of
   value to clients and members, might also be an important factor in developing a
   successful collaboration". Those statements, at least, seem very sound.

   The [925]"Open Technology Development" work produced a roadmap and has a
   [926]Open Technology web site.

12.2 Europe

   The massive [927]Study on the Economic impact of open source software on
   innovation and the competitiveness of the Information and Communication
   Technologies (ICT) sector in the EU (November 20, 2006) summarizes a vast number
   of economics-related FLOSS statistics. Here are a few of the many interesting
   conclusions it draws:
    1. "FLOSS applications are first, second or third-rung products in terms of
       market share in several markets"
    2. "The existing base of quality FLOSS applications with reasonable quality
       control and distribution would cost firms almost Euro 12 billion to reproduce
       internally. This code base has been doubling every 18-24 months over the past
       eight years, and this growth is projected to continue for several more
       years".
    3. "The existing base of FLOSS software represents a lower bound of about
       131.000 real person-years of effort"
    4. "Defined broadly, FLOSS-related services could reach a 32% share of all IT
       services by 2010, and the FLOSS-related share of the economy could reach 4%
       of European GDP by 2010".
    5. "Firms have invested an estimated Euro 1.2 billion in developing FLOSS
       software... represent in total at least 565 000 jobs and Euro 263 billion in
       annual revenue"
    6. "FLOSS potentially saves industry over 36% in software R&D investment"
    7. "Doubling the rate of FLOSS take-up in Europe would result in a software
       share of investment at 1.5% of GDP, reducing but not closing this investment
       gap with the US"

   The Interchange of Data between Administrations (IDA) programme is managed by the
   European Commission, with a mission to "coordinate the establishment of
   Trans-European telematic networks between administrations". IDA has developed a
   vast amount of FLOSS information, including an extraordinary amount of
   information specific to Europe. [928]IDA's Open Source Observatory provides a
   great deal of FLOSS background information, [929]FLOSS news, [930]European FLOSS
   case studies, [931]FLOSS events (both European and abroad), and other material.
   IDA also provides [932]The IDA Open Source Migration Guidelines to describe how
   to migrate from proprietary programs to FLOSS programs. The authors state that
   "There are many reasons for Administrations to migrate to OSS. These include: the
   need for open standards for e-Government; the level of security that OSS
   provides; the elimination of forced change; the cost of OSS. All these benefits
   result in far lower [Information Technology] costs". Another paper of interest to
   governments considering FLOSS is [933]Paul Dravis' "Open Source Software:
   Perspectives for Development", developed for the World Bank Group. The
   [934]Consortium for Open Source in the Public Administration aims to analyze the
   effects of introducing open data standards and Open Source software for personal
   productivity and document management in European public administrations.

   In 2002 an independent study was published by the European Commission. Titled
   [935]"Pooling Open Source Software", and financed by the Commission's Interchange
   of Data between Administrations (IDA) programme, it recommends creating a
   clearinghouse to which administrations could "donate" software for re-use. This
   facility would concentrate on applications specific to the needs of the public
   sector. More specifically, the study suggests that software developed for and
   owned by public administrations should be issued under an open source license,
   and states that sharing software developed for administrations could lead to
   across-the-board improvements in efficiency of the European public sector.

   [936]In October 2002, the European Commission awarded Netproject a pilot contract
   valued at EUR250,000 to examine deployment of FLOSS in government departments.

   It's worth noting that many people believe that Europe and the U.S. tend to
   approach FLOSS differently. [937]Larry Augustin went to a European conference
   and outlined what he considered to be major differences in the typical U.S. and
   European outlooks on FLOSS.

   [938]As reported in the Washington Post on November 3, 2002, Luis Millan Vazquez
   de Miguel, the minister of education, science and technology in a western region
   of Spain called Extremadura, is heading the launch of a government campaign to
   convert all the area's computer systems (in government offices, businesses and
   homes) from the Windows operating system to GNU/Linux. Vazquez de Miguel said
   over 10,000 desktop machines have already been switched, with 100,000 more
   scheduled for conversion in the next year. The regional government paid a local
   company $180,000 to create a set of freely available software, and invested in a
   development center that is creating customized software. "So far, the government
   has produced 150,000 discs with the software, and it is distributing them in
   schools, electronics stores, community centers and as inserts in newspapers. It
   has even taken out TV commercials about the benefits of free software". The Post
   also discussed some of the reasons some governments are turning to FLOSS. "Among
   the touchiest issues that Microsoft faces outside the United States is the
   uneasiness some countries have expressed about allowing an American company to
   dominate the software industry in their country. `Non-U.S. governments in
   particular view open source as a way to break the stranglehold against Microsoft.
   If Microsoft owns everything in their countries, their own companies can't get a
   foothold in the software industry,' said Ted Schadler, an analyst for Forrester
   Research Inc". Some Spanish government systems and those belonging to the
   telecommunications company Telefonica recently were shifted to Linux partly
   because of security concerns. In Florence, legislators talked of breaking "the
   computer science subjection of the Italian state to Microsoft."

   [939]Germany intends to increase its use of FLOSS. [940]IBM signed a Linux deal
   with Germany; Germany's Interior Minister, Otto Schily, said the move would help
   cut costs, improve security in the nation's computer networks, and lower
   dependence on any one supplier.

   Munich, Germany (the third largest German city) has decided to [941]migrate all
   of its 14,000 computers in public administration to GNU/Linux and other FLOSS
   office applications, dropping Microsoft's Windows in the process. [942]USA Today
   gives a detailed discussion of how this decision was made. [943]Here's more
   information about the Munich approach. The GNU/Linux system bid had a somewhat
   higher cost than the lowest cost Microsoft bid, but when looking at the details,
   the claim that Microsoft was lower cost appears misleading -- Microsoft's bid was
   significantly different than the GNU/Linux bid. For example, in Microsoft's bid,
   the Windows systems wouldn't be upgraded for 6 years. Who doesn't upgrade for 6
   years? If Munich had agreed to that in 1998, in 2004 they'd still be running only
   Windows 98 and NT 4.0. Also, in Microsoft's low bid, many systems would only get
   the word processor Word, not a full office suite (GNU/Linux systems typically
   come with complete office application suites at no additional cost, important for
   people who suddenly need to read presentations and spreadsheets). Also, some have
   noted that many of the costs for the GNU/Linux approach can be viewed as a
   "removing Microsoft" cost rather than the cost of using GNU/Linux per se;
   delaying the switch could have made the cost of switching later even larger due
   to increased lock-in. It's likely, however, that this decision was made with a
   long-term view of many issues, not solely by cost.

   In 2003, the "Open Source and Open Source Software for the Dutch government"
   (OSSOS) program started. [944]By December 2006 the OSSOS program was reporting
   successes. Ten large municipalities representing 2.7 million individuals,
   including Amsterdam and The Hague, had signed a manifest. Instead of emphasizing
   open source software per se, they emphasized four goals: Supplier independence,
   interoperability, transparency and verifiability, and digital durability. While
   not flatly forbidding proprietary software, FLOSS meets these criteria better
   than proprietary software.

   In France, [945]the French police are switching from Microsoft Office to
   OpenOffice.org, according to the French industry news service Toolinux. More
   specifically, the group making this switch is the "Gendarmerie Nationale
   française", who act as police in the French countryside but are technically part
   of the French Army. According to the report, by the end of January 2005 about
   35,000 PCs and workstations were to be equipped with the FLOSS office suite;
   by summer 2005 the number was to reach 80,000. The French police expected to save
   more than two million euros by switching.

   [946]Finnish MPs are encouraging the use of GNU/Linux in government systems.

   Statskontoret, the Swedish Agency for Public Management, performed a feasibility
   study on free and open source software and came to very positive conclusions
   (see the report in [947]English or [948]Swedish).

   On October 10, 2002, the [949]Danish Board of Technology released a report about
   the economic potential in using Open Source software in the public
   administration. The report showed a potential savings of 3.7 billion Danish
   Kroners (500 million Euros) over four years. A pilot project in the Hanstholm
   municipality determined that switching the office suite from Microsoft Office to
   OpenOffice.org and StarOffice did not increase their number of problems and that
   each user only needed 1 to 1.5 hours of training to learn the new office suite.
   The municipality will now use OpenOffice.org and StarOffice on all workplaces
   (200 in all) and will save 300,000 Danish Kroners (about 40,000 Euros) each year
   in license fees. They will still use Microsoft Windows as their OS. [950]You may
   want to see the Danish government's report on FLOSS.

   [951]In July 2002, the UK Government published a policy on the use of Open Source
   Software. This policy had the following points:
    1. UK Government will consider OSS solutions alongside proprietary ones in IT
       procurements. Contracts will be awarded on a value for money basis.
    2. UK Government will only use products for interoperability that support open
       standards and specifications in all future IT developments.
    3. UK Government will seek to avoid lock-in to proprietary IT products and
       services.
    4. UK Government will consider obtaining full rights to bespoke software code or
       [customizations] of COTS (Commercial Off The Shelf) software it procures
       wherever this achieves best value for money.
    5. UK Government will explore further the possibilities of using OSS as the
       default exploitation route for Government funded R&D software.

   As follow-on work, the United Kingdom's Office of Government Commerce (OGC)
   performed "proof of concept" trials of Open Source Software (OSS) in a range of
   public bodies. In October 2004 the OGC summarized its key findings, taking into account
   information from elsewhere. Their [952]Government Open Source Software Trials
   Final Report is publicly available, and has some very interesting things to say.
   A [953]brief news article describes the report. The report concludes that:
     * Viability of OSS: Open Source software is a viable and credible alternative
       to proprietary software for infrastructure implementations, and for meeting
       the requirements of the majority of desktop users;
     * Obstacles to implementation: The main obstacles to widespread implementation
       of Open Source software are: for desktop applications, the current lack of
       complex functionality which can affect ease of migration and interoperability
       for some [organizations]; and for business applications, the lack of Open
       Source products to compete with large-scale proprietary enterprise-level
       products; no significant obstacles were noted for the adoption of Open Source
       in infrastructure developments;
     * Costs and benefits: Adoption of Open Source software can generate significant
       savings in hardware and software costs for infrastructure implementation, and
       reduce the licensing costs and hardware refresh requirements for desktop
       implementation;
     * Lessons learned: Adoption of Open Source, particularly for the desktop,
       requires investment in planning, training of users, development of skills for
       implementation and support, and detailed consideration of migration and
       interoperability issues.

   The UK report recommended that public sector bodies should:
    1. examine carefully the technical and business case for implementation of Open
       Source software and the role which OSS could play in current and future
       projects, working with their outsourced IT providers where appropriate;
    2. review the potential for server consolidation, comparing the benefits of OSS
       with proprietary solutions;
    3. consider the potential costs and benefits of migration to an OSS desktop for
       transaction users, (potentially in conjunction with use of "thin client"
       architecture solutions);
    4. identify the role of open standards in future IS/IT strategy and policy, in
       conformance with the e-Government Interoperability Framework (eGIF);
    5. consider requirements for the development of skills in Open Source
       development, deployment and operation within the [organization], and review
       the availability of such skills in their outsourced IT service providers;
    6. review their current infrastructure and applications - in collaboration with
       their outsourced IT providers where relevant - well in advance of any planned
       procurement or renewal, and determine whether current technologies and IT
       policies inhibit future choice; and if so consider what steps may be
       necessary to prevent future "lock in";
    7. consider the benefits of incremental change by diversifying OSS use beyond
       the server platform to products like Email, LDAP, Web and Internet Browser.

   In 2005 the U.K. government announced that it is backing a new initiative called
   the "Open Source Academy", which is aimed at promoting the use of open-source
   software in the public sector (by local UK governments), and providing a forum
   for those working in the public sector to test and use such software. It is
   funded by the Office of the Deputy Prime Minister (ODPM) under its e-Innovations
   investment program. One justification cited for the Open Source Academy was a
   Dutch study published in January 2005 by the Maastricht Economic Research
   Institute on Innovation and Technology, which reported that 32% of local
   authorities in the U.K. use FLOSS, compared with 71% in France, 68% in Germany,
   and 55% in the Netherlands. Andy Hopkirk, head of research and development at the
   National Computing Centre (NCC), wasn't sure that the U.K. was "lagging so
   far behind on open source", but did admit that "There is a cultural difference
   between the U.K. and rest of the world -- the U.K. is conservative in the uptake
   of new things and has a let's-wait-and-see attitude. There is also the `not
   invented here' syndrome". There seems to be a widespread perception that U.K. use
   is lower not because the software is inappropriate, but because the U.K. local
   governments are so risk-averse that they cannot seize opportunities when they
   become available. Thus, the "Open Source Academy" has the goal of ensuring that
   local authorities know about their alternatives, and it also "provides an
   opportunity for local authorities to get the resources as well as the time and
   space to try things out without risking their own infrastructures... It's a type
   of sand-pit area". InfoWorld reported that "Participants in the Open Source
   Academy are hoping that the program will help the U.K. government catch up with
   the rest of Europe in implementing open-source software as part of government
   projects". More information on the Open Source Academy is available in the
   [954]eGov monitor and [955]InfoWorld.

12.3 South/Central America

   [956]Brazil's government is planning to switch 300,000 computers to Linux says a
   January 2005 story; [957]various activists are encouraging such a switch. Indeed,
   as noted in [958]my Travelogue on the 6th International Free Software
   Conference (FISL 6.0) in Porto Alegre, Brazil, Brazil is a hotbed of FLOSS
   activity. You can find more information in [959]O Impacto do Software Livre e de
   Código Aberto na Indústria de Software do Brasil (loosely translated, "The Impact
   of Free Software and Open Source in the Brazilian Software Industry").

   Peru has contemplated [960]passing a law requiring the use of FLOSS for public
   administration (government); the rationale for doing so, besides saving money,
   includes supporting "Free access to public information by the citizen, Permanence
   of public data, and the Security of the State and citizens". Dr. Edgar David
   Villanueva Nuñez (a Peruvian Congressman) has written a detailed letter explaining
   the rationale for the proposed law and why he believes it is beneficial (and
   necessary) for the government, arguing that the law is needed to provide these
   basic guarantees. [961]Marc Hedlund has written a brief description of the
   letter; an
   English translation is available (from [962]Opensource.org, [963]GNU in Peru,
   [964]UK's "The Register", and [965]Linux Today); there is a longer discussion of
   this [966]available at Slashdot. Whether or not this law passes, it is an
   interesting development.

12.4 Other Countries

   A [967]Linux Journal article notes many interesting international experiments and
   approaches, for example, Pakistan plans to install 50,000 low cost computers in
   schools and colleges all over Pakistan using GNU/Linux. [968]A June 14, 2002
   article in PC World also lists actions various governments are taking.

   The [969]Korean government announced that it plans to buy 120,000 copies of
   Hancom Linux Deluxe this year, enough to switch 23% of its installed base of
   Microsoft users to open source equivalents; by standardizing on GNU/Linux and
   HancomOffice, the Korean government expects savings of 80% compared with buying
   Microsoft products (HancomOffice isn't FLOSS, but GNU/Linux is). [970]Taiwan is
   starting a national plan to jump-start the development and use of FLOSS. The
   [971]Ministry of Defence in Singapore has installed OpenOffice.org on 5,000
   PCs as of November 2004, and is planning to deploy it on a further 15,000 over
   the following 18 months.

   [972]Sun Microsystems has announced a deal with China to provide one million
   Linux desktops, and mentioned that China "has pledged to deploy 200 million
   copies of open standards-based desktop software".

   [973]South Africa's government departments are being officially encouraged to
   stop using (expensive) proprietary software, and to use FLOSS instead. This is
   according to a [974]January 15, 2003 announcement by Mojalefa Moseki, chief
   information officer with the State Information Technology Agency (Sita). South
   Africa plans to save 3 billion Rands a year (approximately $338 million USD),
   increase spending on software that stays in their country, and increase
   programming skill inside the country. South Africa reports that its small-scale
   introductions have already saved them 10 million Rands (approximately $1.1
   million USD). [975]More information is available at Tectonic (see also [976]South
   African minister outlines OSS plans). [977]The state of Oregon is considering a
   FLOSS bill as well. [978]Japan has earmarked 1 billion yen for a project to boost
   operating systems other than Microsoft Windows - it is expected to be based on
   FLOSS, particularly Linux, and both South Korea and China are coordinating with
   Japan on it. [979]In December 2003, Israel's government suspended purchases of
   new versions of Microsoft Office software and began actively encouraging the
   development of open-source alternatives (especially OpenOffice.org).
   [980]Indian President A.P.J. Abdul Kalam called for his country's military to use
   FLOSS to ward off cybersecurity threats; as supreme commander of the Indian armed
   forces, this is a directive he can implement.

   [981]Sri Lanka declared the week beginning September 5, 2005, to be "Free and
   Open Source Software Week".

   Brendan Scott's [982]Research Report: Open Source and the IT Trade Deficit of
   July 2004 found that, in Australia alone, the costs of the closed source
   operating system alone were causing a trade deficit of $430 million per
   year.

   The Australian Government Information Management Office released in 2005 [983]"A
   Guide to Open Source Software for Australian Government Agencies".

   More recently, in 2008 the [984]Australian Open Source Industry & Community
   Report 2008 was published, which is an excellent summary (including many
   quantitative figures) of the state of FLOSS in Australia. In April 2008,
   [985]Kate Lundy (Labor's ACT senator) publicly noted that Australian IT
   developers are being stifled by the dominance of US software companies that hold
   on to government business through vendor lock-in, claiming that it is "a market
   failure resulting in very little competitive tension, and very little
   innovation... the money spent on licence fees is effectively dead money because
   it's not going into innovation". Senator Lundy said departments should look at
   the annual cost of fees for the right to use software and consider whether the
   money might be better spent on developing products based on open standards; "then
   agencies can look at creating a more competitive environment that allows
   open-source software to compete, based on open standards".

   The [986]Canadian Association for Open Source is actively involved in Canadian
   policy issues, working "to protect the right of the owners of digital technology
   to make their own software choices, and further to seek to remove any legal or
   other barriers that would favour non-FLOSS software over FLOSS".

   There have been many discussions about the advantages of FLOSS in less developed
   countries. Heinz and Heinz argue in their paper [987]Proprietary Software and
   Less-Developed Countries - The Argentine Case that the way proprietary software
   is brought to market has deep and perverse negative consequences regarding the
   chances of growth for less developed countries. Danny Yee's [988]Free Software as
   Appropriate Technology argues that Free Software is an appropriate technology for
   developing countries, using simple but clear analogies. [989]Free as in
   Education: Significance of the Free/Libre and Open Source Software for Developing
   Countries, commissioned by the Finnish Ministry for Foreign Affairs, examines the
   significance of FLOSS and related concepts for developing countries. The
   non-governmental organizations OneWorld Finland and the Service Centre for
   Development Cooperation (KEPA) maintain the [990]FLOSS for Development website,
   which identifies other analyses of FLOSS to support their goal, "To find out if
   and how Free/Libre and Open Source software is useful for developing countries in
   their efforts to achieve overall development, including bridging the digital
   divide".

12.5 Countering Government Use of FLOSS

   Many proprietary companies compete with FLOSS products. The rise of competition
   in IT markets, particularly in places where there hadn't been competition before,
   has had the general beneficial effect of lowering the costs of software to
   governments. Even simply threatening to use a different supplier is often enough
   to gain concessions from all vendors, and since governments are large customers,
   they often gain large concessions. And of course all companies work to provide
   information on their products that puts them in the best possible light.
   Competing in terms of technical capabilities, cost, support, and so on is a
   normal part of government acquisition, and not further considered here.

   However, there have been some efforts (or at least perceived efforts) to prevent
   government use of FLOSS, or forbid use of the most common FLOSS license (the
   GPL). Generally these efforts have not had much success.

   As described in [991]"Geek activism" forces Congress to reconsider Open Source,
   in 2002 a letter from the U.S. Congress unrelated to FLOSS was modified by
   Representative Adam Smith from Washington state. Smith's largest campaign
   donation source is Microsoft Corporation. The modifications added statements
   strongly discouraging the use of the GPL. The letter was originally signed by 67
   Congressmen, but as [992]an Associated Press piece notes, "Smith's attack on
   open-source drew an angry response from one of the original authors of the
   letter, Rep. Tom Davis, R-Va., chairman of the House Government Reform
   subcommittee on technology and procurement policy. "We had no knowledge about
   that letter that twisted this position into a debate over the open source GPL
   issues," said Melissa Wojciak, staff director of the subcommittee. Wojciak added
   that Davis supports government funding of open-source projects". At the end,
   "Many staffers of the 67 Congressman who signed are now claiming they didn't know
   what they were signing and the letter has been withdrawn". [993]Information Week
   also picked up the story. Also in 2002, the Washington Post reported that there
   had been an [994]aggressive lobbying effort to squelch use of FLOSS in the U.S.
   Department of Defense. The effort didn't work; the DoD released an
   official policy of neutrality.

   So many governments have begun officially requiring that FLOSS options be
   considered, or enacting preferences for FLOSS, that Microsoft has sponsored an
   organization called the [995]Initiative for Software Choice. Many observers
   believe the real purpose of this organization is to prevent governments from
   considering the advantages or disadvantages of a software license when they
   procure software, to prevent governments from requiring consideration of FLOSS
   products, and to encourage the use of standards that inhibit the use of FLOSS.
   [996]Indeed, Microsoft has invested large sums of money to lobby against FLOSS,
   according to CIO magazine.

   An opposing group, founded by Bruce Perens, is [997]SincereChoice.org, which
   advocates that there be a "fair, competitive market for computer software, both
   proprietary and Open Source". [998]Bruce Perens has published an article
   discussing why he believes "Software Choice" is not what it first appears to be.

   This doesn't mean that governments always choose FLOSS; quite the contrary.
   Indeed, most governments are quite conservative in their adoption of FLOSS.
   Articles such as [999]Linux in Government: In Spite of
   Endorsements, Government Linux Projects Still Treading Water and [1000]Not So
   Fast, Linux discuss some of the roadblocks and reasons governments don't use
   FLOSS in various situations.

   Interestingly, FLOSS has forced Microsoft to be more open with its code to
   various governments. [1001]Bloomberg's January 14, 2003 article "Microsoft Has
   New Plan to Share Code With Government" announces that Microsoft Corporation
   "will expand sharing of the code underlying its Windows programs to help
   governments and agencies such as Russia and the North Atlantic Treaty
   Organization (NATO) improve computer security". It notes that "Microsoft is
   facing competition from the Linux operating system, which lets customers view and
   modify its source code. In the government sector in particular, Microsoft has
   lost contracts to Linux, analysts said. More than 20 countries are looking at
   legislative proposals that mandate considering or using Linux in government
   computers... [and Microsoft has] begun to make the code available to governments,
   as well as key customers and partners, in an effort to compete with Linux".

                                  13. Other Information

   Here are some other related information sources:
    1. There are several general information sites about FLOSS or Unix that might be
       of interest, such as the [1002]Free Software Foundation (FSF), the [1003]Open
       Source Initiative website, and the [1004]Linux.org site. George Mason
       University's Exploring and Collecting History Online (ECHO) project has a
       useful collection in its material on [1005]A Free and Open History of Free
       and Open Source Software, and the [1006]Massachusetts Institute of Technology
       (MIT)'s Free / Open Source Research Community website also maintains a useful
       collection of research papers. An older paper is [1007]John Kirch's
       Microsoft Windows NT Server 4.0 versus UNIX ([1008]also archived at the
       Internet Archive). The book [1009]The Cathedral and the Bazaar by Eric
       Raymond examines FLOSS development processes and issues. A useful collection
       of many FLOSS writings, including the essay The Cathedral and the Bazaar, is
       in the [1010]Open Source Reader. Peter Wayner's book [1011]Free For All: How
       Linux and the Free Software Movement Undercut the High-tech Titans describes
       the history and rise of FLOSS, and includes interviews with many key leaders;
       the book can be either downloaded electronically without fee or purchased as
       a hardcover book. Ganesh C. Prasad has published [1012]The Practical
       Manager's Guide to Linux. [1013]Dan Kegel's "The Case for Linux in
       Universities" discusses why students need exposure to GNU/Linux at
       universities (and thus why universities should support and encourage this).
       The paper [1014]Our Open Source / Free Software Future: It's Just a Matter of
       Time argues that within the next few years, the de facto standard OS that
       nearly everyone uses, as well as much of the commodity software in widespread
       use, will be FLOSS. You can see a collection of general information about
       FLOSS at [1015]my web page listing FLOSS references.
    2. MITRE Corporation has examined the application of FLOSS to military systems.
       Their July 2001 report, [1016]A Business Case Study of Open Source Software,
       concludes that "open source methods and products are well worth considering
       seriously in a wide range of government applications, particularly if they
       are applied with care and a solid understanding of the risks they entail. OSS
       encourages significant software development and code re-use, can provide
       important economic benefits, and has the potential for especially large
       direct and indirect cost savings for military systems that require large
       deployments of costly software products". They also recommend the following
       steps to determine whether to use OSS or proprietary products:
       assess the supporting OSS developer community, examine the market, conduct a
       specific analysis of benefits and risks, compare the long-term costs, and
       choose your strategy. MITRE has received a Leadership Award from the
       non-profit Potomac Forum for showing that OSS can provide substantial
       advantages over proprietary software, particularly when reliability and
       long-term support are key requirements.
       Later, the [1017]Washington Post article Open-source Fight Flares at
       Pentagon reported that "Microsoft Corp. is aggressively lobbying the
       Pentagon to squelch its growing use of freely distributed computer software
       and switch to proprietary systems such as those sold by the software giant,
       according to officials familiar with the campaign...". But the effort
       backfired.
       MITRE Corporation, presumably in response to such efforts, prepared a
       second report at the request of the Department of Defense (DoD) Defense
       Information Systems Agency (DISA). The report was titled [1018]"Use of Free
       and Open Source Software in the US Dept. of Defense" and was originally dated
       May 10, 2002, publicly released on October 28, 2002, and was updated slightly
       in 2003. This report concluded that FLOSS use in the DoD is widespread and
       should be expanded. Specifically, it found that "banning [FLOSS] would
       have immediate, broad, and strongly negative impacts on the ability of many
       sensitive and security-focused DoD groups to defend against cyberattacks".
       The report also found that the GPL so dominates in DoD applications that a
       ban on just the GPL would have the same strongly negative impacts as banning
       all FLOSS. MITRE noted that FLOSS "plays a far more critical role in the DoD
       than has been generally recognized". In a two-week survey period MITRE
       identified a total of 115 FOSS applications and 251 examples of their use.
       MITRE concluded that "Neither the survey nor the analysis supports the
       premise that banning or seriously restricting [FLOSS] would benefit DoD
       security or defensive capabilities. To the contrary, the combination of an
       ambiguous status and largely ungrounded fears that it cannot be used with
       other types of software are keeping [FLOSS] from reaching optimal levels of
       use". It short, MITRE found that FLOSS is widely used, and should be even
       more widely used. On May 28, 2003, [1019]the DoD issued a formal memo placing
       FLOSS on a level playing field with proprietary software (titled simply "Open
       Source Software (OSS) in the DoD"), without imposing any additional barriers
       beyond those already applied to its other software.
       The Post article also noted that "at the Census Bureau, programmers used
       open-source software to launch a Web site for obtaining federal statistics
       for $47,000, bureau officials said. It would have cost $358,000 if
       proprietary software were used".
    3. The European [1020]Free/Libre and Open Source Software (FLOSS): Survey and
       Study is a large multi-part report examining FLOSS from a number of different
       vantage points. The report is divided into the following (besides its summary
       and raw data):
          + Part I: Use of Open Source Software in Firms and Public Institutions,
          + Part II: Firms' Open Source Activities: Motivations and Policy
            Implications
          + Part II B: Open Source Software in the Public Sector: Policy within the
            European Union
          + Part III: Basics of Open Source Software Markets and Business Models
          + Part IV: Survey of Developers
          + Part V: Source Code Survey
    4. In 2004, [1021]Computer Sciences Corporation (CSC) released the large paper
       Open Source: Open for Business, which reports many advantages of employing
       FLOSS.
    5. Microsoft has been trying to claim that open source is somehow dangerous, and
       indeed is its leading critic, yet the Wall Street Journal's Lee Gomes found
       that "Microsoft Uses Open-Source Code Despite Denying Use of such Software".
       Here are some interesting quotes from his article:

     ... But Microsoft's statements Friday suggest the company has itself been
     taking advantage of the very technology it has insisted would bring dire
     consequences to others. "I am appalled at the way Microsoft bashes open source
     on the one hand, while depending on it for its business on the other," said
     Marshall Kirk McKusick, a leader of the FreeBSD development team.
       More recently Microsoft has targeted the GPL license rather than all FLOSS
       licenses, claiming that the GPL is somehow anti-commercial. But this claim
       lacks evidence, given the many commercial companies (e.g., IBM, Sun, and Red
       Hat) who are using the GPL. Also, see this paper's earlier note that
       [1022]Microsoft itself makes money by selling a product with GPL'ed
       components. The same article closes with this statement:

     In its campaign against open-source, Microsoft has been unable to come up with
     examples of companies being harmed by it. One reason, said Eric von Hippel, a
     Massachusetts Institute of Technology professor who heads up a research effort
     in the field, is that virtually all the available evidence suggests that open
     source is "a huge advantage" to companies. "They are able to build on a common
     standard that is not owned by anyone," he said. "With Windows, Microsoft owns
     them".
       ([1023]Eric von Hippel gave an interview in 2008.) Other related articles
       include [1024]Bruce Perens' comments, [1025]Ganesh Prasad's How Does the
       Capitalist View Open Source?, and the open letter [1026]Free Software Leaders
       Stand Together.
    6. Indeed, many who have analyzed general information technology (IT) trends or
       Microsoft's actions have concluded that strongly depending on Microsoft's
       products is now a dangerous strategy. [1027]2003 And Beyond by Andrew Grygus
       examines the IT industry from a small business point of view, and identifies
       a large number of dangers from depending on a Microsoft-based infrastructure.
       Fundamentally, Microsoft is working hard to increase customer dependency, and
       charges exorbitantly once the customer cannot practically switch.
    7. Microsoft inadvertently advocated FLOSS in leaked documents called the
       [1028]"Halloween" documents. The original first two Halloween documents found
       that FLOSS was far more effective than they wished to admit. [1029]Halloween
       7 gives results of one of their surveys, again, with many positive comments
       about FLOSS.
    8. Another leaked internal Microsoft document is [1030]Converting a UNIX .COM
       Site to Windows (by David Brooks). This document describes lessons learned
       when converting Hotmail from the FLOSS FreeBSD to Microsoft Windows after
       Microsoft purchased Hotmail, including advantages and disadvantages of each
       approach, and ends up identifying a large number of advantages of their
       competition. For example, it noted that "entrepreneurs in the startup world
       are generally familiar with one version of UNIX (usually through college
       education), and training in one easily converts to another". [1031]An article
       in The Register summarizes many of the advantages of the Unix approach given
       in the paper.
    9. Several documents were written to counter Microsoft's statements such as
       those in Microsoft's "Linux Myths". This includes [1032]LWN's response and
       [1033]Jamin Philip Gray's response, and the [1034]FUD-counter site. The
       [1035]shared source page argues that Microsoft's "shared source" idea is
       inferior to open source. [1036]Richard Stallman's The GNU GPL and the
       American Way counters the amusing claim by Microsoft that the GPL was
       "un-American". The letter [1037]Free Software Leaders Stand Together argues
       against the statements by Craig Mundie. You can find many general sites about
       Microsoft, including [1038]Cloweth's site.
   10. In a story full of ironies, in 2002 [1039]Microsoft and Unisys teamed up in a
       well-funded marketing campaign against Unix, in part to try to revive Unisys'
       sagging sales of Windows-based products. The 18-month, $25 million campaign,
       dubbed "We have the Way Out," specifically attacked the Unix offerings of
       Sun, IBM, and Hewlett-Packard, but since the major FLOSS OSes are Unix or
       Unix-like, it attacks them as well. In a delicious irony, it was revealed
       that [1040]the anti-Unix campaign website is powered by Unix software - in
       this case, FreeBSD (an FLOSS version of Unix) and the FLOSS Web server
       Apache. Once this was publicly revealed, Microsoft and Unisys quickly
       switched to a Windows-based system... and then [1041]the website failed to
       operate at all for several days. If that wasn't enough, [1042]Andrew Orlowski
       reported in The Register a further analysis of this website, noting that
       port 3306 - the default port of the FLOSS MySQL database - was open on their
       website. In other words, it appears that their anti-Unix site was still
       using FLOSS database software (not Microsoft's own database) that is
       primarily deployed on
       Unix-like systems. Even their original imagery turns out to have had serious
       problems; the campaign's original graphic showed a floor almost wholly
       covered in mauve paint (Sun Microsystems' color), and the alternative offered
       was to jump through a window. [1043]Many literate readers will recognize this
       symbol (the act of throwing someone out of, or being thrown out of, a window)
       as defenestration, a way of killing rulers and also a popular way of inviting
       kings to commit suicide in 17th century Europe. In other words, this imagery
       suggests that you should use the window[s] to commit suicide (!). [1044]Leon
       Brooks then analyzed the site further - and found that the "way out" site
       used JSP (a technology created by Sun, a company of Unix specialists). He
       also found that the site violated many standards; the site's content failed
       the W3C validation suites (Microsoft is a member of the W3C), and used a Windows-only
       character set that is not only non-standard, but actively conflicts with an
       important international standard (and ironically one which Microsoft is
       actively promoting). If using only Windows is so wonderful, why can't the
       advocacy site conform to international standards? The real problem here, of
       course, is that trying to convince people that Unix is to be avoided at all
       costs - while using Unix and then having serious problems when trying to use
       an alternative - is both ironic and somewhat hypocritical.
       But by August 2004, [1045]Unisys decided to adopt Linux on its ES7000 Intel
       processor-based servers, responding to customer demand. In a 2005 interview,
       Unisys' Steve Rawsthorn admitted "Not having Linux in our kitbag precluded us
       from some bids... It got to the point we were being asked for it [Linux], and
       we had to do it".
   11. [1046]"How Big Blue Fell For Linux" is an article on how IBM transitioned to
       becoming a major backer. IBM announced that it planned to invest $1 billion
       in GNU/Linux in 2001 alone (see the [1047]IBM annual report). In 2002
       [1048]IBM reported that it had already made almost all of that money back; I
       and others are a little skeptical of these claims, but it's clear that IBM
       has invested significantly in GNU/Linux and seems to be pleased with the
       results (for an example, see their [1049]Linux-only mainframe). This is not
       just a friendly gesture, of course; companies like [1050]IBM view FLOSS
       software as a competitive advantage, because FLOSS frees them from control by
       another organization, and it also enables customers formerly locked into
       competitors' products to switch to IBM products and services.
       Thankfully, this is a good deal for consumers too. In 2002, IBM had [1051]250
       employees working full time to improve Linux.
   12. For a scientifically unworthy but really funny look at what people who use
       the various OSes say, take a look at the [1052]Operating System
       Sucks-Rules-O-Meter. It counts how many web pages make statements like "Linux
       rocks". It's really barely an opinion poll, but if nothing else it's great
       for a laugh.
   13. There have been several academic studies of FLOSS. [1053]Stefan Koch
       maintains a Free/Open Source Software Academic Bibliography which has
       pointers to many. One academic study, [1054]"A Framework for Open Source
       Projects" (a Master's thesis in Computer Science by Gregor J. Rothfuss),
       describes a framework for characterizing Open Source projects, introducing
       notions of actors, roles, areas, processes, and tools, and depicts their
       interrelationships. The goal was to provide a conceptual foundation and an
       aid for organizing and managing Open Source projects.
   14. Several studies examine developers (instead of the programs they write),
       including [1055]"A Quantitative Profile of a Community of Open Source Linux
       Developers", [1056]Herman, Hertel and Niedner's study (based on
       questionnaires), and the [1057]Who Is Doing It (WIDI) study. The European
       [1058]Free/Libre and Open Source Software Survey (FLOSS) has a large amount
       of information on developers. The paper [1059]Two Case Studies of Open Source
       Software Development: Apache and Mozilla examines two major open source
       projects, the Apache web server and the Mozilla browser, using archives
       (such as source code change history and problem reports) to quantify
       aspects of developer participation, core team size, code ownership,
       productivity, defect density, and problem resolution intervals for these
       projects. The [1060]Boston Consulting Group/OSDN Hacker Survey (release 0.73,
       July 21, 2002) made some interesting observations by sampling SourceForge
       users. For example, it gives evidence that open source developers can be
       divided into four groups (based on their motivations for writing FLOSS
       software):
         a. Believers (19%): believe source code should be open.
         b. Learning and Fun (29%): for non-work needs and intellectual stimulation.
         c. Hobbyists (27%): need the code for a non-work reason.
         d. Professionals (25%): for work needs and professional status.
       Journalists sometimes like to romanticize FLOSS developers as being mostly
       teenage boys with little experience, but the survey didn't support that view.
       Young people doing important development is certainly a great story, and it
       does happen. For example, [1061]13-year-old Elizabeth Garbee will give
       a presentation on extending Tuxracer. But the study found that the open
       source developers surveyed are mostly experienced professionals, having an
       average of 11 years of programming experience; the average age was 28.
       The paper [1062]"Altruistic individuals, selfish firms? The structure of
       motivation in Open Source Software" by Andrea Bonaccorsi and Cristina Rossi
       (First Monday, January 2004) discusses a 2002 survey of 146 Italian firms
       supplying FLOSS and compares it with surveys of individual programmers. It
       found significant differences between motivations of individuals and firms,
       with firms emphasizing economic and technological reasons. The top reasons
       (in order) of FLOSS-supplying firms were (1) because OSS allows small
       enterprises to afford innovation, (2) because contributions and feedback from
       the Free Software community are very useful in fixing bugs and improving
       software, (3) because of the reliability and quality of OSS, (4) because the
       firm wants to be independent of the price and licence policies of large
       software companies, and (5) because the firm agrees with the values of the
       Free Software movement.
   15. If you determine that you wish to start an FLOSS project, there are some
       documents available to aid you. This includes the [1063]Free Software Project
       Management HOWTO and [1064]Software Release Practice HOWTO. You should also
       read [1065]The Cathedral and the Bazaar.
   16. Other evaluations include the [1066]Gartner Group and [1067]GNet evaluations.

   For general information on FLOSS, see my [1068]list of Open Source Software /
   Free Software (FLOSS) references at http://dwheeler.com/oss_fs_refs.html

                                     14. Conclusions

   FLOSS has significant [1069]market share in many markets, is often the most
   [1070]reliable software, and in many cases has the best [1071]performance. FLOSS
   [1072]scales, both in problem size and project size. FLOSS software often has far
   better [1073]security, perhaps due to the possibility of worldwide review.
   [1074]Total cost of ownership for FLOSS is often far less than proprietary
   software, especially as the number of platforms increases. These statements are
   not merely opinions; these effects can be shown quantitatively, using a wide
   variety of measures. This doesn't even consider [1075]other issues that are hard
   to measure, such as freedom from control by a single source and freedom from
   licensing management (with its accompanying risk of audit and litigation).
   [1076]Organizations can transition to FLOSS in part or in stages, which for many
   is a far more practical transition approach.

   Realizing these potential FLOSS benefits may require approaching problems in a
   different way. This might include using thin clients, deploying a solution by
   adding a feature to an FLOSS product, and understanding the differences between
   the proprietary and FLOSS models. Acquisition processes may need to change to
   include specifically identifying FLOSS alternatives, since simply putting out a
   "request for proposal" may not yield all the viable candidates. FLOSS products
   are not the best technical choice in all cases, of course; even organizations
   which strongly prefer FLOSS generally have some sort of waiver process for
   proprietary programs. However, it's clear that considering FLOSS alternatives can
   be beneficial.

   Of course, before deploying any program you need to evaluate how well it meets
   your needs, and some organizations do not know how to evaluate FLOSS programs. If
   this describes your circumstance, you may wish to look at the companion articles
   [1077]How to Evaluate FLOSS Programs and the [1078]Generally Recognized as Mature
   (GRAM) list.

   This paper cannot possibly list all the possible FLOSS programs that may be of
   interest to you. However, users of Windows who are looking for desktop software
   often try programs such as [1079]OpenOffice.org (FLOSS office suite),
   [1080]Firefox (FLOSS web browser), and [1081]Thunderbird (FLOSS email client).
   Projects like [1082]The OpenDisc project (formerly [1083]The OpenCD project)
   create CDs or DVDs that include those (and other) FLOSS programs for Windows with
   nice installers and so on. Many FLOSS programs aren't available for Windows,
   though, or do not work as well on Windows. Those interested in trying out the
   GNU/Linux operating system often start with a simple CD that doesn't touch their
   hard drive, such as [1084]Gnoppix or [1085]Knoppix. They then move on to various
   Linux distributions such as Red Hat (inexpensive [1086]Fedora Core or
   professionally-supported [1087]Red Hat Enterprise Linux), [1088]Novell/SuSE,
   [1089]Mandriva (formerly MandrakeSoft), or [1090]Ubuntu (nontechnical users may
   also be interested in pay-per-month distributions like [1091]Linspire, while
   technically knowledgeable users may be interested in distributions like
   [1092]Debian).

   FLOSS options should be carefully considered any time software or computer
   hardware is needed. Organizations should ensure that their policies encourage,
   and not discourage, examining FLOSS approaches when they need software.
     ____________________________________________________________________________

              Appendix A. About Open Source Software / Free Software (FLOSS)

   This appendix gives more information about open source software / free software
   (FLOSS): [1093]definitions related to FLOSS (of source code, free software, open
   source software, and various movements), [1094]motivations of developers and
   developing companies, [1095]history, [1096]license types, [1097]FLOSS project
   management approaches, and [1098]forking.

A.1 Definitions

   There are official definitions for the terms "Free Software" (as the term is used
   in this text) and "open source software". However, understanding a few
   fundamentals about computer software is necessary before these definitions make
   sense. Software developers create computer programs by writing text, called
   "source code," in a specialized language. This source code is often mechanically
   translated into a format that the computer can run. As long as the program
   doesn't need to be changed (say, to support new requirements or be used on a
   newer computer), users don't necessarily need the source code. However, changing
   what the program does usually requires both possession of the source code and
   permission to change it. In other words, whoever legally controls the source
   code controls
   what the program can and cannot do. Users without source code often cannot have
   the program changed to do what they want or have it ported to a different kind of
   computer.
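
   To make this concrete, here is a small illustrative example (the file name
   hello.c and its message are hypothetical, not taken from any particular
   program). In the C language, a complete program's source code might look like
   this:

      /* hello.c - a tiny example of "source code" written in the C language. */
      #include <stdio.h>

      int main(void)
      {
          /* Anyone who has this file (and permission to change it) can edit the
             next line, recompile, and get a program that behaves differently. */
          printf("Hello, world\n");
          return 0;
      }

   A compiler command such as "cc hello.c -o hello" mechanically translates this
   text into an executable file the computer can run. A user who receives only
   that executable can run the program but cannot change the greeting (or
   anything else about the program's behavior); a user who also has hello.c, and
   permission to modify it, can.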

   The next two sections give the official definitions of Free Software and Open
   Source Software (though in practice, the two definitions are essentially the same
   thing); I then discuss some related definitions, and contrast the terms "Free
   Software" and "Open Source Software".

  A.1.1 Definition of Free Software

   FLOSS programs have existed since digital computers were invented, but beginning
   in the 1980s, people began to try to capture the concept in words. The two main
   definitions used are the "free software definition" (for free software) and the
   "open source definition" (for open source software). Software meeting one
   definition usually meets the other as well. Since the term "free software" came
   first, we'll examine its definition first.

   The [1099]Free Software Definition is published by Richard Stallman's Free
   Software Foundation. Here is the key text of that definition:

     "Free software" is a matter of liberty, not price. To understand the concept,
     you should think of "free" as in "free speech," not as in "free beer". Free
     software is a matter of the users' freedom to run, copy, distribute, study,
     change and improve the software. More precisely, it refers to four kinds of
     freedom, for the users of the software:
     * The freedom to run the program, for any purpose (freedom 0).
     * The freedom to study how the program works, and adapt it to your needs
       (freedom 1). Access to the source code is a precondition for this.
     * The freedom to redistribute copies so you can help your neighbor (freedom 2).
     * The freedom to improve the program, and release your improvements to the
       public, so that the whole community benefits. (freedom 3). Access to the
       source code is a precondition for this.

     A program is free software if users have all of these freedoms. Thus, you
     should be free to redistribute copies, either with or without modifications,
     either gratis or charging a fee for distribution, to anyone anywhere. Being
     free to do these things means (among other things) that you do not have to ask
     or pay for permission. You should also have the freedom to make modifications
     and use them privately in your own work or play, without even mentioning that
     they exist. If you do publish your changes, you should not be required to
     notify anyone in particular, or in any particular way. The freedom to use a
     program means the freedom for any kind of person or organization to use it on
     any kind of computer system, for any kind of overall job, and without being
     required to communicate subsequently with the developer or any other specific
     entity.

   The text defining "free software" is actually much longer, explaining the
   approach further. It notes that "Free software does not mean non-commercial. A free
   program must be available for commercial use, commercial development, and
   commercial distribution. Commercial development of free software is no longer
   unusual; such free commercial software is very important".

   Many people emphasize the freedom to choose between software applications, but
   there's also been a lot of discussion noting that when one person makes a choice,
   it can often reduce the freedom of others to make choices. [1100]Editorial: The
   fifth freedom gives one perspective on this.

  A.1.2 The Open Source Definition

   Open source software is officially defined by the [1101]open source definition:

     Open source doesn't just mean access to the source code. The distribution
     terms of open-source software must comply with the following criteria:

     1. Free Redistribution

     The license shall not restrict any party from selling or giving away the
     software as a component of an aggregate software distribution containing
     programs from several different sources. The license shall not require a
     royalty or other fee for such sale.

     2. Source Code

     The program must include source code, and must allow distribution in source
     code as well as compiled form. Where some form of a product is not distributed
     with source code, there must be a well-publicized means of obtaining the
      source code for no more than a reasonable reproduction cost, preferably
     downloading via the Internet without charge. The source code must be the
     preferred form in which a programmer would modify the program. Deliberately
     obfuscated source code is not allowed. Intermediate forms such as the output
     of a preprocessor or translator are not allowed.

     3. Derived Works

     The license must allow modifications and derived works, and must allow them to
     be distributed under the same terms as the license of the original software.

     4. Integrity of The Author's Source Code

     The license may restrict source-code from being distributed in modified form
     only if the license allows the distribution of "patch files" with the source
     code for the purpose of modifying the program at build time. The license must
     explicitly permit distribution of software built from modified source code.
     The license may require derived works to carry a different name or version
     number from the original software.

     5. No Discrimination Against Persons or Groups

     The license must not discriminate against any person or group of persons.

     6. No Discrimination Against Fields of Endeavor

     The license must not restrict anyone from making use of the program in a
     specific field of endeavor. For example, it may not restrict the program from
     being used in a business, or from being used for genetic research.

     7. Distribution of License

     The rights attached to the program must apply to all to whom the program is
     redistributed without the need for execution of an additional license by those
     parties.

     8. License Must Not Be Specific to a Product

     The rights attached to the program must not depend on the program's being part
     of a particular software distribution. If the program is extracted from that
     distribution and used or distributed within the terms of the program's
     license, all parties to whom the program is redistributed should have the same
     rights as those that are granted in conjunction with the original software
     distribution.

     9. The License Must Not Restrict Other Software

     The license must not place restrictions on other software that is distributed
     along with the licensed software. For example, the license must not insist
     that all other programs distributed on the same medium must be open-source
     software.

      10. License Must Be Technology-Neutral

      No provision of the license may be predicated on any individual technology
      or style of interface.

  A.1.3 Other Related Definitions and License Issues

   Sometimes it's useful to talk about software whose source can be viewed, but
   which does not meet the requirements of the Free Software Definition or Open Source
   Definition. A common phrase is "open box software", popularized by people such as
   [1102]John Viega, though the term is also sometimes used in other ways. Another
   phrase is "source-viewable software", which is probably the clearest.

   The Open Source Definition was actually derived from the [1103]Debian Free
   Software Guidelines (DFSG); those original guidelines are still maintained and
   used by the widely-used and influential Debian project. Thus, the Debian
   guidelines are nearly identical to the Open Source Definition, yet Debian tends
   to use the term "Free Software" in its materials.

   In addition, the debian-legal mailing list discusses licensing issues in great
   depth, in an effort to evaluate licenses based on the freedoms they grant or do
   not grant. [1104]The DFSG and Software License FAQ states that "The DFSG is not a
   contract. This means that if you think you've found a loophole in the DFSG then
   you don't quite understand how this works. The DFSG is a potentially imperfect
   attempt to express what free software means to Debian".

   [1105]The DFSG and Software License FAQ also defines three additional "tests"
   used on the debian-legal mailing list to help them evaluate whether or not a
   license is "Free" (as in freedom). These tests aren't the final word, but because
   they're described as scenarios, they are sometimes easier for people to
   understand (and I quote the Debian FAQ here):
    1. The Desert Island test. Imagine a castaway on a desert island with a
       solar-powered computer. This would make it impossible to fulfill any
       requirement to make changes publicly available or to send patches to some
       particular place. This holds even if such requirements are only upon request,
       as the castaway might be able to receive messages but be unable to send them.
       To be Free, software must be modifiable by this unfortunate castaway, who
       must also be able to legally share modifications with friends on the island.
    2. The Dissident test. Consider a dissident in a totalitarian state who wishes
       to share a modified bit of software with fellow dissidents, but does not wish
       to reveal the identity of the modifier, or directly reveal the modifications
       themselves, or even possession of the program, to the government. Any
       requirement for sending source modifications to anyone other than the
       recipient of the modified binary - in fact any forced distribution at all,
       beyond giving source to those who receive a copy of the binary - would put
       the dissident in danger. For Debian to consider software Free it must not
       require any such excess distribution.
    3. The Tentacles of Evil test. Imagine that the author is hired by a large evil
       corporation and, now in their thrall, attempts to do the worst to the users
       of the program: to make their lives miserable, to make them stop using the
       program, to expose them to legal liability, to make the program non-Free, to
       discover their secrets, etc. The same can happen to a corporation bought out
       by a larger corporation bent on destroying Free software in order to maintain
       its monopoly and extend its evil empire. The license cannot allow even the
       author to take away the required freedoms!

   And there are practical issues that arise too:
    1. GPL compatibility is very desirable. The GPL is by far the most popular FLOSS
       license. Thus, an FLOSS license that isn't compatible with the GPL causes
       many practical problems, because the vast amount of GPL software can't be
       combined with it. Indeed, if a specification cannot be implemented by
       software released under the GPL, it essentially discriminates against FLOSS
       business models in general because so much FLOSS is released under the GPL.
       Choosing a GPL-compatible license (such as the BSD-new, MIT/X, LGPL, or GPL
       license) is often the safest course. [1106]See my paper for more information
       on why selecting a GPL-compatible license is important for FLOSS projects.
    2. Having many FLOSS licenses ("license proliferation") is undesirable.
       [1107]Bruce Perens' article "The Open Source Definition" explained back in
       1999 that "Do not write a new license if it is possible to use one of [small
       set of common licenses listed in the paper]. The propagation of many
       different and incompatible licenses works to the detriment of Open Source
       software because fragments of one program cannot be used in another program
       with an incompatible license". New licenses also make it hard for customers
       and developers to understand what their requirements are. More recently,
       there have been increasingly active steps to discourage creating new FLOSS
       licenses (which are typically corporate vanity licenses) in favor of using
       one of a small set of licenses that are already in wide use. For more
       information, see comments by [1108]Danese Cooper (Intel and
       secretary/treasurer for the Open Source Initiative (OSI)) and [1109]Chris
       DiBona (Google), as well as the article [1110]"HP exec calls for fewer
       open-source licenses" by Robert McMillan (ComputerWorld, August 6, 2004).
    3. Choice-of-law and choice-of-venue requirements are very undesirable. Many
       developers strongly object to licenses that specify that the licensee must
       agree to be judged by the laws of a specific jurisdiction and/or be judged at
       a specific location. This was a key problem, for example, for the older
       Python licenses. The problem is that choice-of-law and choice-of-venue
       requirements create superfluous incompatibilities with any other licenses
       with choice-of-law and/or choice-of-venue restrictions (which would, in
       practice, always be different from each other). A goal of FLOSS licenses is
       to allow software to be combined and modified in new, innovative ways, and
       such statements interfere with that goal.
    4. Advertising clauses are very undesirable. Some old licenses, like the old BSD
       license, required that credit be given to developers in certain ways, e.g.,
       whenever a product is advertised. When there's only one developer, that
       doesn't sound too bad. But imagine what happens as more developers get
       involved -- suddenly each advertisement has to individually list (say) 20,000
       people! These kinds of licenses don't scale well as more people become
       involved, and major FLOSS projects can involve large numbers of developers.
       Crediting developers in the source code is very common practice, of course,
       but that's not the same thing.

   A technical discussion examining the freedom of a license might compare the
   license against the Free Software Definition (all four freedoms), the Open Source
   Definition (every point) and/or the Debian Free Software Guidelines, and the
   tests (scenarios) above, as well as considering practical concerns like the ones
   above. An example of such analysis is [1111]Mark Shewmaker's August 2004
   examination of the Microsoft Royalty Free Sender ID Patent License.

  A.1.4 Open Source Movement and Free Software Movement

   As a practical matter, the definitions given above for free software and open
   source software are essentially the same. Software meeting the criteria for one
   generally ends up meeting the other definition as well; indeed, those who
   established the term "open source" describe their approach as a marketing
   approach for Free Software. However, to some people, the connotations and motives are
   different between the two terms.

   Some people who prefer to use the term "free software" intend to emphasize that
   software should always meet such criteria for ethical, moral, or social reasons,
   emphasizing that these should be the rights of every software user. Such people
   may identify themselves as members of the "free software movement". Richard
   Stallman is a leader of this group; his arguments are given in his article
   [1112]Why "Free Software" is better than "Open Source"

   Some people are not persuaded by these arguments, or may believe the arguments
   but do not think that they are effective arguments for convincing others.
   Instead, they prefer to argue the value of FLOSS on other grounds, such as cost,
   security, or reliability. Many of these people will prefer to use the term "open
   source software", and some may identify themselves as part of the "open source
   movement". Eric Raymond was one of the original instigators of the name "open
   source" and is widely regarded as a leader of this group.

   Is the "free software movement" a subset of the "open source movement"? That
   depends on how the "open source movement" is defined. If the "open source
   movement" is a general term describing anyone who supports OSS or FS for whatever
   reason, then the "free software movement" is indeed a subset of the "open source
   movement". However, some leaders of the open source movement (such as Eric
   Raymond) specifically recommend not discussing user freedoms, and since this is
   the central principle of the free software movement, the two movements are
   considered separate groups by many.

   The [1113]Free/Libre and Open Source Software Survey (FLOSS), part IV, summarizes
   a survey of FLOSS developers (primarily European developers), and specifically
   examined these terms. In this study, 48.0% identified themselves as part of the
   "Free Software", community, 32.6% identified themselves as part of the "open
   source" community, and 13.4% stated that they did not care. A slight majority
   (52.9%) claimed that the movements different in principle, but the work is the
   same, while 29.7% argued that the movements were fundamentally different, and
   17.3% do not care at all about the differences. After examining the data, the
   surveyors determined that FLOSS developers could be divided into six groups:
    1. developers who assign themselves to the Free Software community and who see
       fundamental differences between the two communities (18%).
    2. developers who consider themselves as part of the Open Source community and
       who perceive fundamental differences between the two communities (9%).
    3. developers who assign themselves to the Free Software community and who
       perceive only principle differences between the two communities, but consider
       work in the two communities the same (26%).
    4. developers who assign themselves to the Open Source community and see
       principle, but no fundamental differences between the two communities (17%).
    5. developers who assign themselves to either the Free Software or the Open
       Source Software community, but are not bothered by differences between the
       two communities (9%).
    6. developers who do not care to which community they belong (20%).

   This difference in terminology and motivation can make it more difficult for
   authors of articles on FLOSS (like this one). The motivations of the different
   movements may be different, but since in practice the developers usually work
   together, it's very useful to have a common term that covers all groups. Some
   authors choose to use one of the terms (such as OSS). Other authors use some
   other term merging the two motivations, but as of this time there is no single
   merged term used by everyone. This article uses the merged term FLOSS.

A.2 Motivations

   This leads to a more general and oft-asked question: "Why do developers
   contribute to FLOSS projects?" The short answer is that there are many different
   motivations.

   The [1114]Boston Consulting Group/OSDN Hacker Survey (release 0.73, July 21,
   2002) made some interesting observations by sampling SourceForge users. The top
   motivations given for participating in FLOSS development were as follows:
    1. intellectually stimulating (44.9%)
    2. improves skill (41.3%)
    3. work functionality (33.8%)
    4. code should be open (33.1%)
    5. non-work functionality (29.7%)
    6. obligation from use (28.5%)

   By examining these motivations, they concluded that open source developers could
   be divided into four groups (based on their primary motivations for writing FLOSS
   software):
    a. Believers (19%): believe source code should be open.
    b. Learning and Fun (29%): for non-work needs and intellectual stimulation.
    c. Hobbyists (27%): need the code for a non-work reason.
    d. Professionals (25%): for work needs and professional status.

   Part IV of the [1115]Free/Libre and Open Source Software Survey (FLOSS),
   mentioned above, also examined individual developer motivations, and found a
   variety of motivations.

   Many businesses contribute to FLOSS development, and their motivations also vary.
   Many companies develop FLOSS to sell support - by giving away the product, they
   expect to get far more support contracts. [1116]Joel Spolsky's "Strategy Letter
   V" notes that "most of the companies spending big money to develop open source
   software are doing it because it's a good business strategy for them". His
   argument is based on microeconomics, in particular, that every product in the
   marketplace has substitutes and complements. A substitute is another product you
   might buy if the first product is too costly, while a complement is a product
   that you usually buy together with another product. Since demand for a product
   increases when the prices of its complements decrease, smart companies try to
   commoditize their products' complements. For many companies, supporting an FLOSS
   product turns a complementary product into a commodity, resulting in more sales
   (and money) for them.
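
   In standard microeconomics notation (a sketch; the symbols are generic, not
   taken from Spolsky's article), write Q_A for the quantity demanded of product A
   and P_A, P_B for the prices of A and of a complementary product B:

      Q_A = D(P_A, P_B), \qquad \frac{\partial Q_A}{\partial P_B} < 0
      \quad \text{(A and B are complements)}

   Driving P_B toward zero - for example, by giving away a FLOSS product B - thus
   raises Q_A, the demand for the complementary product A (support contracts,
   hardware, or services) that the company actually sells.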

   One widely-read essay discussing commercial motivations is Eric Raymond's
   [1117]The Magic Cauldron. The European [1118]Free/Libre and Open Source Software
   (FLOSS): Survey and Study has additional statistics on the motivations of
   individuals and corporations who develop FLOSS.

A.3 History

   In the early days of computing (approximately 1945 to 1975), computer programs
   were often shared among developers, just as FLOSS practitioners do now.

   Steven M. Bellovin reported (in 2006), "I do remember SHARE, early Usenix, DECUS,
   and more, where sharing software and system modifications was a way of life.
   Indeed, `bring a blank tape' was the standard advice when attending early Usenix
   meetings... The notion of closed source software products didn't really catch on
   until IBM unbundled its operating system from its hardware sales, in the
   early-to-mid 1970s. Before that, though systems weren't what we know as open
   source, anyone who cared had the source tapes, and changes were freely
   distributed".

   At a 1965 Fall Joint Computer Conference, the paper [1119]Introduction and
   Overview of the Multics System by F. J. Corbató and V. A. Vyssotsky made this
   interesting statement about the Multics system, whose software cost an
   extraordinary amount of money to develop for the time:

     "It is expected that the Multics system will be published when it is operating
      substantially. ... Such publication is desirable for two reasons: First, the
     system should withstand public scrutiny and criticism volunteered by
     interested readers; second, in an age of increasing complexity, it is an
     obligation to present and future system designers to make the inner operating
     system as lucid as possible so as to reveal the basic system issues. ... it is
     presumptuous to think that the initial system can successfully meet all the
     requirements that have been set. The system will evolve under the influence of
     the users and their activities for a long time and in directions which are
     hard to predict at this time... It is expected that most of the system
     additions will come from the users themselves and the system will eventually
     become the repository of the procedure and data knowledge of the community".

   While the Multics software wasn't released at that time, their rationale for
   recommending it is intriguingly consonant with the reasons many FLOSS programs
   are released today.

   An important development for FLOSS was the start of the ARPAnet, the early form of
   the Internet. Another critical development was the operating system Unix,
   developed by AT&T researchers, and distributed as source code (with modification
   rights) for a nominal fee. Indeed, the interfaces for Unix eventually became the
   basis of the POSIX suite of standards.

   However, as years progressed, and especially in the 1970s and 1980s, software
   developers increasingly closed off their software source code from users. This
   included the Unix system itself; many had grown accustomed to the freedom of
   having the Unix source code, but AT&T suddenly increased fees and limited
   distribution, making it impossible for many users to change the software they
   used and share those modifications with others.

   Richard Stallman, a researcher at the MIT Artificial Intelligence Lab, found this
   closing of software source code intolerable. In 1984 he started the GNU project
   to develop a complete Unix-like operating system which would be Free Software
   (free as in freedom, not as in price, as described above). In 1985, Stallman
   established the Free Software Foundation (FSF) to work to preserve, protect and
   promote Free Software; the FSF then became the primary organizational sponsor of
   the GNU Project. The GNU project developed many important software programs,
   including the GNU C compiler (gcc) and the text editor emacs. A major legal
   innovation by Stallman was the GNU General Public License (GPL), a widely popular
   FLOSS software license. However, the GNU project was stymied in its efforts to
   develop the "kernel" of the operating system. The GNU project was following the
   advice of academics to use a "microkernel architecture," and was finding it
   difficult to develop a strong kernel using this architecture. Without a kernel,
   the GNU project could not fulfill its goal.

   Meanwhile, the University of California at Berkeley had had a long relationship
   with AT&T's Unix operating system, and Berkeley had ended up rewriting many Unix
   components. Keith Bostic solicited many people to rewrite the remaining key
   utilities from scratch, and eventually managed to create a nearly-complete system
   whose source code could be freely released to the public without restriction. The
   omissions were quickly filled, and soon a number of operating systems were
   developed based on this effort. Unfortunately, these operating systems were held
   under a cloud of concern from lawsuits and counter-lawsuits for a number of
   years. Another issue was that since the BSD licenses permitted companies to take
   the code and make it proprietary, companies such as Sun and BSDI did so -
   continuously siphoning developers from the openly sharable code, and often not
   contributing back to the publicly available code. Finally, the projects that
   developed these operating systems tended to be small groups of people who gained
   a reputation for rarely accepting the contributions by others (this reputation is
   unfair, but nevertheless the perception did become widespread). The descendants
   of this effort include the capable operating systems NetBSD, OpenBSD, and
   FreeBSD, collectively called the *BSDs. However, while they are both used and
   respected, and proprietary variants of these (such as Apple Mac OS X) are
   thriving, another FLOSS effort quickly gained the limelight and much more market
   share.

   In 1991, Linus Torvalds began developing a small operating system kernel called
   "Linux", at first primarily for learning about the Intel 80386 chip. Unlike the
   BSD efforts, Torvalds eventually settled on the GPL license, which forced
   competing companies working on the kernel code to work together. Advocates of the
   *BSDs dispute that this is an advantage, but even today, major Linux
   distributions hire key kernel developers to work together on common code, in
   contrast to the commercial companies based on the *BSDs, which often do
   not share their improvements to a common program. Torvalds made a number of
   design decisions that in retrospect were remarkably wise: using a traditional
   monolithic kernel design (instead of the "microkernel approach" that slowed the
   GNU project), using the Intel 386 line as the primary focus, working to
   support user requests (such as "dual booting"), and supporting hardware that was
   technically poor but widely used. And finally, Torvalds stumbled into a
   development process rather different from traditional approaches by exploiting
   the Internet. He publicly released new versions extremely often (sometimes more
   than once a day, allowing quick identification when regressions occurred), and he
   quickly delegated areas to a large group of developers (instead of sticking to a
   very small number of developers). Instead of depending on rigid standards, rapid
   feedback on small increments and Darwinian competition were used to increase
   quality.

   When the Linux kernel was combined with the already-developed GNU operating
   system components and some components from other places (such as from the BSD
   systems), the resulting operating system was surprisingly stable and capable.
   Such systems were called GNU/Linux systems or simply Linux systems. Note that
   there is a common misconception in the media that needs to be countered here:
   Linus Torvalds never developed the so-called "Linux operating system". Torvalds
   was the lead developer of the Linux kernel, but the kernel is only one of many
   pieces of an operating system; most of the GNU/Linux operating system was
   developed by the GNU project and by other related projects.

   In 1996, Eric Raymond realized that Torvalds had stumbled upon a whole new style
   of development, combining the sharing possibilities of FLOSS with the speed of
   the Internet into a new development process. His essay [1120]The Cathedral and
   the Bazaar describes that process so that others could try to emulate the
   approach. The essay was highly influential, and in particular convinced Netscape
   to switch to an FLOSS approach for its next generation web browser (the road for
   Netscape was bumpy, but ultimately successful).

   In spring of 1997, a group of leaders in the Free Software community gathered,
   including Eric Raymond, Tim O'Reilly, and Larry Wall. They were concerned that
   the term "Free Software" was too confusing and unhelpful (for example, many
   incorrectly thought that the issue was having no cost). The group coined the term
   "open source" as an alternative term, and Bruce Perens developed the initial
   version of the "open source definition" to define the term. The term "open
   source" is now very widely used, but not universally so; Richard Stallman (head
   of the FSF) never accepted it, and even Bruce Perens switched back to using the
   term "Free Software" because Perens felt that there needed to be more emphasis on
   user freedom.

   Major Unix server applications (such as the FLOSS Apache web server) were easily
   moved to GNU/Linux or the *BSDs, since they all essentially implemented the POSIX
   standards. As a result, GNU/Linux and the *BSDs rapidly gained significant market
   share in the server market. A number of major initiatives began to fill in gaps
   to create completely FLOSS modern operating systems, including graphical
   toolkits, desktop environments, and major desktop applications. In 2002, the
   first user-ready versions of capable and critical desktop applications (Mozilla
   for web browsing and OpenOffice.org for an office suite) were announced.

   You can learn more about the history of FLOSS from material such as [1121]Open
   Sources: Voices from the Open Source Revolution and Peter Wayner's Free for All:
   How Linux and the Free Software Movement Undercut the High-Tech Titans.

A.4 Licenses

   There are dozens of FLOSS licenses, but the vast majority of FLOSS software uses
   one of the four major licenses: the GNU General Public License (GPL), the GNU
   Lesser (or Library) General Public License (LGPL), the MIT (aka X11) license, and
   the BSD-new license. Indeed the Open Source Initiative refers to these four
   licenses as the [1122]classic open source licenses. The GPL and LGPL are termed
   "copylefting" licenses ([1123] also called "protective" licenses), that is, these
   licenses are designed to prevent (protect) the code from becoming proprietary.

   Here is a short description of these licenses:
    1. The GPL allows anyone to use the program and modify it, but prevents code
        from becoming proprietary once distributed, and it also forbids proprietary
       programs from "linking" to it.
    2. The MIT and BSD-new licenses let anyone do almost anything with the code
       except sue the authors. One minor complication: there are actually two "BSD"
       licenses, sometimes called "BSD-old" and "BSD-new"; new programs should use
       BSD-new instead of BSD-old.
    3. The LGPL is a compromise between the GPL and the MIT/BSD-new approaches, and
       was originally intended for code libraries. Like the GPL, LGPL-licensed
       software cannot be changed and made proprietary, but the LGPL does permit
       proprietary programs to link to the library, like the MIT/BSD-new licenses.

   Note that all of these licenses (the GPL, MIT, BSD-new, and LGPL) permit the
   commercial sale and the commercial use of the software, and many such programs are
   sold and used that way. See [1124]Perens' paper for more information comparing
   these licenses.
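
   As a purely illustrative aside (this sketch is mine, not part of the original
   license discussion), a project typically records its license choice both by
   shipping the license text and, increasingly, by marking each source file. The
   SPDX identifier convention shown below is one common way to do this; the Python
   file and the specific identifiers are only examples:

      # Hypothetical example file. An "SPDX-License-Identifier" comment is one
      # common convention for recording, per file, which license applies; the
      # full license text still ships with the project.
      #
      # SPDX-License-Identifier: GPL-2.0-or-later
      #
      # Identifiers for the other licenses described above would look like:
      #   SPDX-License-Identifier: LGPL-2.1-or-later   (library-friendly copyleft)
      #   SPDX-License-Identifier: MIT                 (permissive)
      #   SPDX-License-Identifier: BSD-3-Clause        (permissive, "BSD-new")

      def main():
          # Trivial body; the point is that license metadata travels with the code.
          print("License metadata travels with the code it covers.")

      if __name__ == "__main__":
          main()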

   The most popular FLOSS license by far is the GPL. For example, Freshmeat.net
   reported on April 4, 2002 that 71.85% of the 25,286 software branches (packages)
   it tracked are GPL-licensed (the next two most popular were LGPL, 4.47%, and the
   BSD licenses, 4.17%). Sourceforge.net reported on April 4, 2002 that the GPL
   accounted for 73% of the 23,651 "open source" projects it hosted (next most
   popular were the LGPL, 10%, and the BSD licenses, 7%). In my paper [1125]More
   than a Gigabuck: Estimating GNU/Linux's Size, I found that Red Hat Linux, one of
   the most popular GNU/Linux distributions, had over 30 million physical source
   lines of code in version 7.1, and that 50.36% of the lines of code were licensed
   solely under the GPL (the next most common were the MIT license, 8.28%, and the
   LGPL, 7.64%). If you consider the lines that are dual licensed (licensed under
   both the GPL and another license, allowing users and developers to pick the
   license to use), the total lines of code under the GPL accounts for 55.3% of the
   total. [1126]My paper on GPL compatibility discusses these figures further, and
   discusses why, if you choose to develop FLOSS code, you should strongly consider
   using a licensing approach that is compatible with the GPL.

   There are whole books about software licensing in general, or FLOSS licensing in
   particular, if you wish to delve into this topic in depth. One book about FLOSS
   licensing is [1127]Understanding Open Source and Free Software Licensing by
   Andrew M. St. Laurent.

A.5 Management Approaches

   There is no single approach to managing an FLOSS project, just as there is no
   single approach to managing proprietary projects. Management approaches are
   strongly influenced by the size and scope of the project, as well as the
   leadership styles of those managing the project.

   [1128]The Cathedral and the Bazaar argues for a particular style of development,
   termed the "bazaar" style. In this approach, there are a large number of small,
   incremental releases, and a large number of developers can send in patches for
   proposed improvements. The releases need to compile and run (to some extent), so
   that developers can test and improve them. Not all FLOSS projects work this way,
   but many do.

   It is useful to examine the management approaches of successful projects to
   identify approaches that may work elsewhere. Here are a few:
    1. Linux kernel. The Linux kernel's development process is based on a hierarchy
       of four levels: ordinary developers, maintainers, trusted lieutenants, and
       the benevolent dictator. Ordinary developers can propose changes, but usually
        they submit their proposals to a maintainer of a particular component of the
        kernel; the maintainers then send their patch sets up to a trusted lieutenant,
        who then sends them up to the benevolent dictator (currently Linus Torvalds). At
       each stage testing can take place. The benevolent dictator writes code and
       issues general direction, but his primary job is to be the integrator and
       arbiter of changes. In the past (and probably again in the future) there were
       two branches, the "stable" and "development" branches, where occasionally the
       development branch would become the new stable branch. As of the end of 2004,
       the kernel development process has been modified so that someone else (Andrew
       Morton at the end of 2004) manages a "development" stage, and once Torvalds
       determines a specific change is ready, it's brought into the main stable
       version. Linux distributions then take the stable branch, test it further,
       and select the "best" version of the stable branch. [1129]BusinessWeek named
       Linus Torvalds as one of the best managers of 2004.
     2. Apache. The Apache web server project, in contrast, is run by a group. At the
        top is the "Apache HTTP Server Project Management Committee" (PMC), a group of
        volunteers who are responsible for managing the Apache HTTP Server Project.
       Membership in the Apache PMC is by invitation only and must be approved by
       consensus of the active Apache PMC members. Membership can be revoked by a
       unanimous vote of all the active PMC members other than the member in
       question. Most changes are approved by consensus.
        An action item requiring consensus approval must receive at least 3 binding
        +1 votes and no vetoes (a "-1" vote). An action item requiring majority
        approval must receive at least 3 binding +1 votes and more +1 votes than -1
        votes (i.e., a majority with a minimum quorum of three positive votes); a
        small code sketch of these two thresholds appears after this list.
       Ideas must be review-then-commit; patches can be commit-then-review. With a
       commit-then-review process, they trust that the developer doing the commit
       has a high degree of confidence in the change. Doubtful changes, new
       features, and large-scale overhauls need to be discussed before being
       committed to a repository.
       See the [1130]Apache Voting Rules for more detailed information.
     3. Perl. Perl was originally developed by Larry Wall, but he no longer wishes to
        always have the job of integrating patches himself. Thus, there is a notional
       "patch pumpkin" that must be acquired to change Perl. In Moody's Rebel Code,
       Wall explains that "we have essentially a chief integrator who is called the
       pumpkin holder". Moody adds that this "integration involves taking the
       approved patches and adding them into the main Perl source code". Larry Wall,
       as original developer, can veto any change. [1131]More information about the
       patch pumpkin (as it has currently evolved) is available from perl.com.
    4. Sourceforge-based Applications. Many FLOSS projects are supported by
       SourceForge, which includes the CVS tool for configuration management.
       Typically, those who have write access to the repository simply make their
       updates; others who do not have such access post their requests or patches to
       the bug tracking database (or mailing list) and ask one of those with write
       access to include it. There are typically only a few people with direct write
       access, so conflicts are rare and CVS supports resolving the occasional
       conflict.
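
   As a small aside, the two Apache voting thresholds described above can be
   expressed in a few lines of code. This sketch is mine (it is not Apache project
   code), and it simply assumes that binding votes are collected as +1, 0, and -1
   values:

      # A minimal sketch, assuming binding votes are collected as integers:
      # +1 approves, -1 vetoes/objects, 0 abstains.

      def consensus_approved(binding_votes):
          """Consensus approval: at least 3 binding +1 votes and no vetoes (-1)."""
          return binding_votes.count(+1) >= 3 and binding_votes.count(-1) == 0

      def majority_approved(binding_votes):
          """Majority approval: at least 3 binding +1 votes, more +1 than -1."""
          return (binding_votes.count(+1) >= 3 and
                  binding_votes.count(+1) > binding_votes.count(-1))

      # Three approvals and one objection passes a majority vote but fails a
      # consensus vote, because the single -1 acts as a veto there.
      votes = [+1, +1, +1, -1]
      print(consensus_approved(votes))   # False
      print(majority_approved(votes))    # True

   The veto in the consensus rule fits the project's stated preference for
   discussing doubtful changes before they are committed, rather than simply
   out-voting objectors.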

   Successful FLOSS projects generally have a large number of contributors. A small
   proportion of the contributors write a majority of the code, but the value of the
   rest should not be underestimated; the fact that many others are reviewing the
   system and identifying or fixing particular bugs enables the other developers to
   be more productive (because someone else, who looks at the project in a different
   way, can find or fix a bug faster, freeing the core developers to work on other
   things).

   Large groups can be surprisingly effective at converging to good answers. An
   interesting analysis of this concept in general is given in "The Wisdom of
   Crowds: Why the Many Are Smarter Than the Few and How Collective Wisdom Shapes
   Business, Economies, Societies and Nations" by James Surowiecki. [1132]Groklaw
   reviewed this book.

A.6 Forking

   A fork is a competing project based on a version of the pre-existing project's
   source code. All FLOSS projects can be "forked"; the ability to create a fork is
   fundamental to the definition of FLOSS.

   Simply creating or releasing a variant of a project's code does not normally
   create a fork unless there's an intent to create a competing project. Indeed,
   releasing variants for experimentation is considered normal in a typical FLOSS
   development process. Many FLOSS projects (such as the Linux kernel development
   project) intentionally have "fly-offs" (also called "bake-offs") where different
   developers implement different competing approaches; the results are compared and
   the approach that produces the best results (the "winner") is accepted by the
   project. These "fly-offs" are often discussed in evolutionary terms, e.g., the
   "winning mutation" is accepted into the project and the alternatives are
   abandoned as "evolutionary dead ends". Since all parties intend for the "best"
   approach to be accepted by the project, and for the other approaches to be
   abandoned, these are not forks.

   What is different about a fork is intent. In a fork, the person(s) creating the
   fork intend for the fork to replace or compete with the original project they are
   forking.

   Creating a fork is a major and emotional event in the FLOSS community. It is
   similar to a call for a "vote of no confidence" in a parliament, or a call for a
   strike in a labor dispute. Those creating the fork are essentially stating that
   they believe the project's current leadership is ineffective, and are asking
   developers to vote against the project leadership by abandoning the original
   project and switching to their fork. Those who are creating the fork must argue
   why other developers should support their fork; common reasons given include a
   belief that changes are not being accepted fast enough, that changes are
   happening too quickly for users to absorb them, that the project governance is
   too closed to outsiders, that the licensing approach is hampering development, or
   that the project's technical direction is fundamentally incorrect.

   Most attempts to create forks are ignored, for there must be a strong reason for
   developers to consider switching to a competing project. Developers usually
   resist supporting FLOSS forks: they divide effort that would be more effective
   when combined, they make support and further development more difficult, and they
   require developers to discuss project governance rather than improving the
   project's products. Developers can attempt to support both projects, but this is
   usually impractical over time as the projects diverge. Eric Raymond, in
   Homesteading the Noosphere, argues that a prime motivation in FLOSS development
   is reputation gain through the use of a gift culture, and that forking
   significantly interferes with this motivation.

   There are four different possible outcomes of a fork attempt (ignoring "both
   fail"), and all of them have occurred in the history of FLOSS. These outcomes,
   along with historical examples, are:
    1. The death of the fork (example: libc/glibc). This is by far the most common
       outcome; indeed, many forks never receive enough support to "die".
    2. A re-merging of the fork with the original (example: gcc/egcs). This is where
       the projects rejoin each other (though one or the other is typically the
       dominant source of the combined effort).
     3. The death of the original (examples: XFree86 replaced by X.org, Mambo
        replaced by Joomla).
    4. Successful branching -- both succeed, typically catering to different
       communities (examples: GNU emacs / xemacs, OpenBSD).

   Here is more information about these examples:
    1. glibc vs. libc. When the Linux kernel was first being developed, the kernel
       developers took the FSF's GNU C library (now called glibc) and created their
       own fork of it (called libc). Both were licensed under the LGPL. At the time,
       the Linux kernel developers thought that the FSF's development process for
       the C library was too slow and not responding to their needs. Thus, they
       [1133]created a forked version of GNU libc version 1.07.4 (which had been
       released February 17, 1994). In this case, however, the original GNU C
       library project (led by the FSF) surpassed the forked project over time. Over
       the next few years the original glibc increasingly offered far better
       standards conformance, multi-threading, higher performance, and more features
       than the forked libc project. [1134]Elliot Lee briefly describes this
       history. In this case, the fork was abandoned after several years; in 1997
       through 1998 nearly all GNU/Linux systems switched from libc back to glibc.
    2. gcc vs. egcs. The GNU Compiler Collection (gcc) is a collection of important
       compilers, including a C++ compiler; the main compilers are licensed under
       the GPL. In 1997, there were disagreements over the development approach and
       slow development speed of gcc. In particular, many were dissatisfied with the
       FSF-appointed gcc maintainer, who was very slow to accept changes. Cygnus
       (headed by Michael Tiemann) decided to create a fork of the project named
       egcs, and invited others to join. Egcs worked at an accelerated pace, and
       soon surpassed the original gcc project. In April 1999 the rift was healed;
       the FSF agreed to switch to using the egcs code for gcc, and the egcs project
       agreed to dissolve itself and take over the original gcc project. In this
       case, the fork ended with the forking project's results "taking over" the
       original project.
     3. XFree86 vs. X.org. The XFree86 project historically led development of a
       popular X server. An X server is a critical component for implementing a
        graphical user interface in a typical Unix-like system. The XFree86 project
        traditionally licensed the vast majority of its code under the simple "MIT/X"
        open source license that is [1135]GPL-compatible. The XFree86 president,
       David Dawes, decided to change the XFree86 license to one that wasn't
       GPL-compatible and had many practical problems. This proposed license change
       caused a serious uproar, but the project leader refused to listen to those
       complaints. For example, [1136]Jim Gettys, a well-respected developer and
       co-founder of X, strongly opposed this change to the XFree86 license, even
       though he's not a strong advocate of the GPL. [1137]Richard Stallman politely
       asked that something be worked out. But the project leader wouldn't budge, so
       the users and some of the developers forked the project, creating a new
        project at X.org based on the previous version. An article at [1138]Linux
        Today and a [1139]discussion at Freedesktop.org showed that the leading
        distributors, including Red Hat, Debian, SuSE, Gentoo, Mandrake, and OpenBSD,
        were switching or planned to switch from XFree86 to X.org. Since the XFree86
       folks wouldn't switch to a GPL-compatible license, [1140]the X.Org Foundation
       (formed January 2004) announced its own version of X on April 6, 2004. The
       X.Org foundation version was immediately endorsed by Novell's SUSE, Red Hat,
       HP, TrollTech, and FSF Europe among others. Very soon, nearly all developers
       and users had abandoned XFree86. You can see more information in my
       [1141]cautionary tale about XFree86. This is a case where a project leader
       attempted to make an extremely unpopular licensing change, causing a mass
        exodus of its users and developers. Note how similar this process was to a
       vote of no confidence; the leader was unwilling to listen to his customers
       and developers, so his customers and developers established a project where
       their needs would be met.
     4. Mambo/Joomla. As explained in [1142]Nathan Willis' "In Memoriam: the free
       software projects we lost in 2010", "The PHP-based content management system
       (CMS) Mambo suffered an acrimonious leadership battle in 2005 that led to the
       departure of the bulk of the developers, who started the Joomla CMS. As is
        often the case in such a fork, the remaining owners of the Mambo trademark
       and source code copyrights asserted that nothing was wrong and that
       development would continue unabated. Although that may have been true for a
       while, here at the end of 2010 it has been a full calendar year since there
       were any signs of life from Mambo (longer still since there was a release),
       apart from the occasional Twitter alert that the project's servers had been
       attacked. Joomla, on the other hand, seems fine".
    5. GNU emacs / xemacs, OpenBSD. Sometimes, though this is rare, a fork produces
       two projects which both live on; typically each fork ends up catering to a
       different community. GNU emacs was forked into xemacs, resulting in two
       successful projects. OpenBSD was originally the result of a fork, but it then
       specialized into being an operating system in which "security was more
       important than almost anything else", resulting in a very successful project
       that has not eliminated the other BSDs. Some people, such as Norm Petry,
       describe this kind of forking in evolutionary terms: "this type of fork is
       analogous to speciation, where each resulting species succeeds by filling its
       own, distinct ecological niche".

   Too many forks can be a serious problem for all of the related projects. In fact,
   one of the main reasons that Unix systems lost significant market share compared
   to Windows was the excessive number of Unix forks. Bob Young states this quite
   clearly in his essay "Giving it Away", and also suggests why this is
   unlikely to be a problem in copylefted FLOSS software:

     The primary difference between [GNU/Linux and Unix] is that Unix is just
     another proprietary binary-only ... OS [operating system]. The problem with a
     proprietary binary-only OS that is available from multiple suppliers is that
     those suppliers have short-term marketing pressures to keep whatever
     innovations they make to the OS to themselves for the benefit of their
     customers exclusively. Over time these "proprietary innovations" to each
     version of the Unix OS cause the various Unixes to differ substantially from
     each other. This occurs when the other vendors do not have access to the
     source code of the innovation and the license the Unix vendors use prohibit
     the use of that innovation even if everyone else involved in Unix wanted to
     use the same innovation. In Linux the pressures are the reverse. If one Linux
     supplier adopts an innovation that becomes popular in the market, the other
     Linux vendors will immediately adopt that innovation. This is because they
     have access to the source code of that innovation and it comes under a license
     that allows them to use it.

   Note that the copylefting licenses (such as the GPL and LGPL) permit forks, but
   greatly reduce any monetary incentive to create a fork. Thus, the project's
   software licensing approach impacts the likelihood of its forking.

   The ability to create a fork is important in FLOSS development, for the same
   reason that the ability to call for a vote of no confidence or a labor strike is
   important. Fundamentally, the ability to create a fork forces project leaders to
   pay attention to their constituencies. Even if an FLOSS project completely
   dominates its market niche, there is always a potential competitor to that
   project: a fork of the project. Often, the threat of a fork is enough to cause
   project leaders to pay attention to some issues they had ignored before, should
   those issues actually be important. In the end, forking is an escape valve that
   allows those who are dissatisfied with the project's current leadership to show
   whether or not their alternative is better.
     ____________________________________________________________________________


                                   About the Author

   [1143]David A. Wheeler is an expert in computer security and has a long history
   of working with large and high-risk software systems. His books include Software
   Inspection: An Industry Best Practice (published by IEEE CS Press), Ada 95: The
   Lovelace Tutorial (published by Springer-Verlag), and the [1144]Secure
   Programming for Linux and Unix HOWTO (on how to create secure software). Articles
   he's written related to FLOSS include [1145]More than a Gigabuck: Estimating
   GNU/Linux's Size, [1146]How to Evaluate Open Source Software / Free Software
   (FLOSS) Programs, [1147]Comments on Open Source Software / Free Software (FLOSS)
   Software Configuration Management (SCM) systems, [1148]Make Your Open Source
   Software GPL-Compatible. Or Else, and [1149]FLOSS References. Other
   security-related articles he's written include [1150]Securing Microsoft Windows
   (for Home and Small Business Users), [1151]Software Configuration Management
   (SCM) Security, and [1152]Countering Spam Using Email Passwords. Other articles
   he's written include [1153]The Most Important Software Innovations, [1154]Stop
   Spam!, and an article on [1155]Fischer Random Chess (Chess960). He has released
   software as well, including [1156]flawfinder (a source code scanner for
   developing secure software by detecting vulnerabilities) and [1157]SLOCCount (a
   program to measure source lines of code, aka SLOC). Mr. Wheeler's web site is at
   [1158]http://dwheeler.com. You may contact him using the information at
   [1159]http://dwheeler.com/contactme.html but you may not send him spam (he
   reserves the right to charge fees to those who send him spam).

   Please link to this article at
   http://dwheeler.com/oss_fs_why.html. You may reprint this article (unchanged) an
   unlimited number of times and distribute local electronic copies (e.g., inside an
   organization or at a conference/presentation), as long as the article is provided
   free of charge to the recipient(s). You may also quote this article, as long as
   the quote is clearly identified as a quote and you attribute your quote with the
   article title, URL, and my name (be sure to use my middle initial, "A".). You may
   not "mirror" a copy of this document to the public Internet or other public
   electronic distribution systems; mirrors interfere with ensuring that readers can
   immediately find and get the current version of this document. Copies clearly
   identified as old versions, not included in normal searches as current Internet
   data, and for which there is no charge (direct or indirect) for those allowed
   access are generally fine; examples of acceptable copies are Google caches and
   the Internet archive's copies. [1161]Please contact me if you know of missing
   information, see something that needs fixing (such as a misspelling or
   grammatical error), or would like to translate this article to another human
   language. Translators: I would love to see more freely-available translations of
   this document, and I will help you coordinate with others who may be translating
   the document into that language. Trademarks are registered by various
   organizations, for example, Linux(r) is a trademark of Linus Torvalds. This is a
   personal essay and not endorsed by my employer; many people have found it useful,
   though. This article is a research article, neither software nor a software manual.

References

   1. http://www.spoxdesign.com/web/overview/technology/proc_oss.html
   2. http://www.sil-cetril.org/wheeler/traduction-fr.html
   3. http://oss.mri.co.jp/reports/wheeler/oss_fs_why.html
   4. http://www.hispalinux.es/informes/wheeler/index.html
   5. http://dwheeler.com/contactme.html
   6. http://dwheeler.com/numbers-database/
   7. https://dwheeler.com/oss_fs_refs.html
   8. https://dwheeler.com/oss_fs_why.html#popularity
   9. https://dwheeler.com/oss_fs_why.html#reliability
  10. https://dwheeler.com/oss_fs_why.html#performance
  11. https://dwheeler.com/oss_fs_why.html#scalability
  12. https://dwheeler.com/oss_fs_why.html#security
  13. https://dwheeler.com/oss_fs_why.html#tco
  14. https://dwheeler.com/oss_fs_why.html#non_quantitative
  15. https://dwheeler.com/oss_fs_why.html#fears
  16. https://dwheeler.com/oss_fs_why.html#desktop
  17. https://dwheeler.com/oss_fs_why.html#usereports
  18. https://dwheeler.com/oss_fs_why.html#governments
  19. https://dwheeler.com/oss_fs_why.html#other
  20. https://dwheeler.com/oss_fs_why.html#conclusions
  21. https://dwheeler.com/oss_fs_why.html#appendix
  22. http://dwheeler.com/oss_fs_why.html
  23. http://dwheeler.com/numbers/
  24. http://www.plkr.org/
  25. http://dwheeler.com/archive
  26. http://dwheeler.com/archive/ChangeLog
  27. https://dwheeler.com/oss_fs_refs.html
  28. http://www.forrester.com/Research/Document/Excerpt/0,7211,38866,00.html
  29. http://www.report.cpr.ca.gov/cprrpt/issrec/stops/it/so10.htm
  30. http://www.canopener.ca/article.php?story=31&mode=print
  31. https://dwheeler.com/oss_fs_why.html#scope
  32. https://dwheeler.com/oss_fs_why.html#challenges
  33. https://dwheeler.com/oss_fs_why.html#terminology
  34. https://dwheeler.com/commonsproduction
  35. https://dwheeler.com/oss_fs_why.html#organization
  36. https://dwheeler.com/oss_fs_why.html#popularity
  37. https://dwheeler.com/oss_fs_why.html#reliability
  38. https://dwheeler.com/oss_fs_why.html#performance
  39. https://dwheeler.com/oss_fs_why.html#scalability
  40. https://dwheeler.com/oss_fs_why.html#security
  41. https://dwheeler.com/oss_fs_why.html#tco
  42. http://dwheeler.com/
  43. http://dwheeler.com/numbers/
  44. http://dwheeler.com/numbers-database/
  45. https://groups.google.com/forum/#!forum/numbers-about-free-libre-open-source-software
  46. http://www.gnu.org/philosophy/philosophy.html
  47. http://dwheeler.com/oss_fs_eval.html
  48. http://europa.eu.int/idabc/en/document/2623#migration
  49. http://www.kbst.bund.de/Anlage303777/pdf_datei.pdf
  50. http://www.computerworld.com/softwaretopics/os/linux/story/0,10801,80194,00.html
  51. http://dwheeler.com/oss_fs_refs.html#linux-vs-gnu-linux
  52. http://httpd.apache.org/
  53. http://www.mozilla.org/
  54. http://www.openoffice.org/
  55. https://dwheeler.com/oss_fs_why.html#microsoft-sells-gpl
  56. http://www.opensource.apple.com/
  57. http://weblog.infoworld.com/foster/2004/10/30.html
  58. http://www.ucita.com/
  59. http://www.infoworld.com/articles/op/xml/00/01/24/000124opfoster.html
  60. http://www.cl.cam.ac.uk/netos/papers/2003-xensosp.pdf
  61. http://web.archive.org/web/20051127031507/http://insight.zdnet.co.uk/software/0,39020463,39238437,00.htm
  62. http://web.archive.org/web/20031202165024/http://www.eweek.com/article2/0,4149,1314541,00.asp
  63. http://seattlepi.nwsource.com/business/158237_msftresearch27.html
  64. http://antitrust.slated.org/www.iowaconsumercase.org/011607/3000/PX03096.pdf
  65. http://www.forrester.com/Info/0,1503,355,00.html
  66. http://www.theinquirer.net/?article=18067
  67. http://www.asa.org.uk/adjudications/show_adjudication.asp?adjudication_id=38475&from_index=show_advertisers&dates_of_adjudications_id=578
  68. http://www.businessweek.com/the_thread/techbeat/archives/2005/04/the_truth_about_1.html
  69. http://www.groklaw.net/article.php?story=20050419175709648
  70. https://dwheeler.com/oss_fs_why.html#appendix
  71. https://dwheeler.com/oss_fs_refs.html
  72. http://www.opensource.org/docs/definition.html
  73. http://www.gnu.org/philosophy/free-sw.html
  74. http://dwheeler.com/sloc/
  75. http://dwheeler.com/essays/gpl-compatible.html
  76. http://www.catb.org/~esr/jargon/html/writing-style.html
  77. http://perens.com/Articles/StandTogether.html
  78. http://www.benkler.org/CoasesPenguin.html
  79. http://www.wikipedia.org/
  80. http://creativecommons.org/
  81. http://search.yahoo.com/cc
  82. http://www.demos.co.uk/WideOpen_pdf_media_public.aspx
  83. http://www.freedom-to-tinker.com/
  84. http://www.cl.cam.ac.uk/users/rja14/tcpa-faq.html
  85. http://news.bbc.co.uk/2/hi/technology/4360793.stm
  86. http://codebook.jot.com/Book
  87. https://dwheeler.com/oss_fs_why.html#popularity
  88. https://dwheeler.com/oss_fs_why.html#reliability
  89. https://dwheeler.com/oss_fs_why.html#performance
  90. https://dwheeler.com/oss_fs_why.html#scalability
  91. https://dwheeler.com/oss_fs_why.html#security
  92. https://dwheeler.com/oss_fs_why.html#tco
  93. https://dwheeler.com/oss_fs_why.html#non_quantitative
  94. https://dwheeler.com/oss_fs_why.html#fears
  95. https://dwheeler.com/oss_fs_why.html#desktop
  96. https://dwheeler.com/oss_fs_why.html#usereports
  97. https://dwheeler.com/oss_fs_why.html#other
  98. https://dwheeler.com/oss_fs_why.html#conclusions
  99. https://dwheeler.com/oss_fs_why.html#appendix
 100. https://dwheeler.com/oss_fs_why.html#single-source
 101. https://dwheeler.com/oss_fs_why.html#licensing-litigation
 102. https://dwheeler.com/oss_fs_why.html#greater-flexibility
 103. https://dwheeler.com/oss_fs_why.html#social-moral
 104. https://dwheeler.com/oss_fs_why.html#innovation
 105. https://dwheeler.com/oss_fs_why.html#ossfs-better-supported
 106. https://dwheeler.com/oss_fs_why.html#ossfs-gives-more-legal-rights
 107. https://dwheeler.com/oss_fs_why.html#ossfs-pirated
 108. https://dwheeler.com/oss_fs_why.html#ossfs-protects-from-abandonment
 109. https://dwheeler.com/oss_fs_why.html#enforceable
 110. https://dwheeler.com/oss_fs_why.html#gpl-force-unproprietary
 111. https://dwheeler.com/oss_fs_why.html#ossfs-economically-viable
 112. https://dwheeler.com/oss_fs_why.html#wont-destroy-industry
 113. https://dwheeler.com/oss_fs_why.html#commercialization
 114. https://dwheeler.com/oss_fs_why.html#ossfs-is-compatible-with-capitalism
 115. https://dwheeler.com/oss_fs_why.html#eliminate-competition
 116. https://dwheeler.com/oss_fs_why.html#ossfs-wont-destroy-ip
 117. https://dwheeler.com/oss_fs_why.html#lots_of_software
 118. https://dwheeler.com/oss_fs_why.html#source-access-is-important
 119. https://dwheeler.com/oss_fs_why.html#anti-microsoft
 120. https://dwheeler.com/oss_fs_why.html#contribute-with-code
 121. https://dwheeler.com/oss_fs_why.html#definitions
 122. https://dwheeler.com/oss_fs_why.html#motivation
 123. https://dwheeler.com/oss_fs_why.html#history
 124. https://dwheeler.com/oss_fs_why.html#licenses
 125. https://dwheeler.com/oss_fs_why.html#management
 126. https://dwheeler.com/oss_fs_why.html#forking
 127. http://survey.netcraft.com/
 128. http://news.netcraft.com/archives/2011/05/02/may-2011-web-server-survey.html
 129. http://news.netcraft.com/archives/2006/04/06/april_2006_web_server_survey.html
 130. http://business.newsforge.com/article.pl?sid=06/04/20/1652228
 131. http://news.netcraft.com/archives/2011/05/02/may-2011-web-server-survey.html
 132. https://ssl.netcraft.com/ssl-sample-report/
 133. http://news.com.com/2100-7344-5139511.html
 134. http://www.securityspace.com/s_survey
 135. http://www.securityspace.com/s_survey/data/200703/index.html
 136. http://www.securityspace.com/s_survey/sdata/200703/index.html
 137. http://www.securityspace.com/
 138. http://news.netcraft.com/archives/2007/04/04/open_source_parking_spoofing_headers_to_benefit_apache.html
 139. http://www.netcraft.com/survey
 140. http://www.securityspace.com/s_survey
 141. http://www.pcworld.com/news/article/0,aid,116848,00.asp
 142. https://dwheeler.com/oss_fs_why.html#ie-vulnerabilities
 143. http://www.mozilla.org/products/firefox/
 144. http://story.news.yahoo.com/news?tmpl=story&ncid=1817&e=2&u=/zd/20041101/tc_zd/138409&sid=96120751
 145. http://www.upsdell.com/BrowserNews/stat_trends.htm
 146. http://www.informationweek.com/story/showArticle.jhtml?articleID=159902316
 147. http://www.websidestory.com/services-solutions/datainsights/spotlight.html
 148. http://www.onestat.com/html/aboutus_pressbox36.html
 149. http://www.thecounter.com/stats/
 150. http://www.thecounter.com/stats/2005/February/browser.php
 151. http://www.thecounter.com/stats/2004/August/browser.php
 152. http://www.quotationspage.com/
 153. http://www.figby.com/archives/2005/02/28/browser-stats-the-state-of-firefox/
 154. http://www.e-janco.com/browser.htm
 155. http://www.zdnet.com.au/news/software/0,2000061733,39188309,00.htm
 156. http://www.techweb.com/wire/security/193104314
 157. http://marketshare.hitslink.com/report.aspx?qprid=3
 158. http://www.informationweek.com/news/showArticle.jhtml?articleID=196901142
 159. http://www.xitimonitor.com/etudes/equipement4.asp
 160. http://translate.google.com/translate?u=http%3A//www.xitimonitor.com/etudes/equipement4.asp&langpair=fr%7Cen&prev=/language_tools
 161. http://standblog.org/blog/2005/03/15/93114061-firefox-usage-in-europe-during-week-ends
 162. http://www.cia.gov/cia/publications/factbook/
 163. http://weblogs.mozillazine.org/asa/archives/006444.html
 164. http://www.w3schools.com/browsers/browsers_stats.asp
 165. http://news.com.com/Firefox+drawing+fans+away+from+Microsoft+IE/2100-1032_3-5368302.html?tag=nefd.top
 166. http://calacanis.weblogsinc.com/entry/5574794258282236/
 167. http://www.pcmag.com/article2/0,1759,1645327,00.asp
 168. http://www.informationweek.com/story/showArticle.jhtml?articleID=159908603&tid=5979
 169. http://arstechnica.com/news.ars/post/20050327-4738.html
 170. http://www.informationweek.com/story/showArticle.jhtml?articleID=159908603&tid=5979
 171. http://marketshare.hitslink.com/report.aspx?qprid=3
 172. http://www.computerworld.com/softwaretopics/software/apps/story/0,10801,110194,00.html?source=x4
 173. http://www.informationweek.com/story/showArticle.jhtml?articleID=159902316
 174. http://www.onestat.com/html/aboutus_pressbox44-mozilla-firefox-has-slightly-increased.html
 175. http://webtips.dan.info/brand-x/useragent.html
 176. http://spreadfirefox.com/
 177. http://www.inc.com/magazine/20070201/features-firefox.html
 178. http://en.wikipedia.org/w/index.php?title=Usage_share_of_web_browsers&oldid=430705550
 179. http://www.informationweek.com/news/smb/mobile/showArticle.jhtml?articleID=229000356
 180. http://www.readwriteweb.com/archives/android_market_share_numbers_questioned.php
 181. http://androidheadlines.com/2011/02/android-os-market-share-rises-to-27-tied-with-blackberry-os.html/smartphone-os-share1
 182. http://www.netcraft.com/
 183. http://www.netcraft.com/Survey/index-200106.html#computers
 184. http://www.netcraft.com/Survey/index-200109.html#computers
 185. http://uptime.netcraft.com/up/graph/?mode_u=on&mode_w=on&site=www.google.com
 186. http://uptime.netcraft.com/up/graph/?mode_u=on&mode_w=on&site=www.yahoo.com
 187. http://www.leb.net/hzo/ioscount
 188. http://www.idc.com/itforecaster/itf20000808.stm
 189. http://www.computer.org/computer/homepage/june/ind_trends/index.htm
 190. http://news.com.com/2100-1001-959049.html
 191. http://www.forbes.com/2002/07/15/0715linux.html
 192. http://www.it-director.com/article.php?id=2332
 193. http://www.computerworld.com.au/idg2.nsf/All/67D07652A34F7ABBCA256C6000761846!OpenDocument&NavArea=Home&SelectedCategoryName=News
 194. http://www.linuxdevices.com/articles/AT7342059167.html
 195. http://www.businesswire.com/cgi-bin/f_headline.cgi?bw.111301/213170209
 196. http://www.businesswire.com/cgi-bin/f_headline.cgi?bw.112602/223300066
 197. http://blogs.zdnet.com/open-source/?p=837
 198. http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9006990&intsrc=news_ts_head
 199. http://www.businesswire.com/cgi-bin/f_headline.cgi?bw.020904/240405311
 200. http://www.informationweek.com/story/showArticle.jhtml?articleID=51201599&tid=5979
 201. http://direct.ips.co.jp/book/Template/Goods/go_BookstempGR.cfm?GM_ID=1686&SPM_ID=1&HN_NO=00400
 202. http://www.infonomics.nl/FLOSS
 203. http://oss.mri.co.jp/
 204. http://www.infonomics.nl/FLOSS
 205. http://www.zdnet.com.au/news/0,39023165,20230848,00.htm
 206. http://www.theregister.co.uk/content/4/19662.html
 207. http://www.theregister.co.uk/content/4/19661.html
 208. http://www.zdnet.com/eweek/stories/general/0,11011,2651826,00.html
 209. https://dwheeler.com/idaya_linuxgrowth.pdf
 210. http://www.openforumeurope.org/research.php
 211. http://www.businesswire.com/cgi-bin/f_headline.cgi?bw.062701/211782585&ticker=IBM
 212. http://www.networkcomputing.com/1224/1224f1.html
 213. http://news.com.com/2100-1001-956496.html
 214. http://news.com.com/2009-1001-961354.html
 215. http://news.com.com/2100-1001-984010.html
 216. http://news.com.com/2100-1001-985769.html
 217. http://www.businessweek.com/magazine/content/05_05/b3918001_mz001.htm
 218. http://www.zdnet.com.au/newstech/os/story/0,2000024997,20261699,00.htm
 219. http://cr.yp.to/surveys/smtpsoftware6.txt
 220. http://openacs.org/about/licensing/open-source-licensing
 221. http://cr.yp.to/qmail/dist.html
 222. http://www.oreillynet.com/pub/a/sysadmin/2007/01/05/fingerprinting-mail-servers.html
 223. http://www.isi.edu/~bmanning/in-addr-versions.html
 224. http://www.idg.net.nz/webhome.nsf/NL/1B8AEC1796517F55CC256BF30015ADB4
 225. http://mydns.bboy.net/survey/
 226. http://lwn.net/Articles/1433
 227. http://www.securityspace.com/s_survey/data/man.200703/apachemods.html?mod=UEhQ
 228. http://www.openssh.org/usage/index.html
 229. http://dwheeler.com/frozen/ssh-stats.html
 230. http://dwheeler.com/frozen/ssh-stats-200410.html
 231. http://www.varbusiness.com/sections/technology/tech.asp?ArticleID=44410
 232. http://zdnet.com.com/2100-1104_2-5134836.html?tag=tu.swblog.6566
 233. http://searchenterpriselinux.techtarget.com/originalContent/0,289142,sid39_gci1011227,00.html
 234. http://oetrends.com/news.php?action=view_record&idnum=392
 235. http://www.businessweek.com/technology/content/oct2005/tc20050103_7038_tc_218.htm
 236. http://i.cmpnet.com/infoweek/1057/IWKLinuxOutlook-2005.pdf
 237. http://linuxdevices.com/articles/AT7065740528.html
 238. http://www.businessweek.com/technology/content/dec2005/tc20051228_262746.htm
 239. http://www.linux-watch.com/news/NS8445673704.html
 240. http://techupdate.zdnet.com/techupdate/stories/main/0,14179,2913225,00.html
 241. http://www.cs.wisc.edu/~bart/fuzz/fuzz.html
 242. https://dwheeler.com/fuzz-failure-rate.png
 243. http://www.cs.wisc.edu/~bart/fuzz/fuzz.html
 244. http://home.pacbell.net/s-max/scott/bulletproof-penguin.html
 245. http://www-106.ibm.com/developerworks/linux/library/l-rel/
 246. http://web.archive.org/web/20010606035231/http://www.zdnet.com/sp/stories/issue/0,4537,2387282,00.html
 247. http://gnet.dhs.org/stories/bloor.php3
 248. http://www.businesswire.com/cgi-bin/f_headline.cgi?bw.021103/230420300
 249. http://news.com.com/2100-1001-985221.html?tag=fd_top
 250. http://www.reasoning.com/newsevents/pr/12_15_03.html
 251. http://news.com.com/Security+research+suggests+Linux+has+fewer+flaws/2100-1002_3-5489804.html
 252. http://informationweek.com/story/showArticle.jhtml?articleID=167100724
 253. http://web.archive.org/web/20011011215009/http://www.syscontrol.ch/e/news/Serversoftware.html
 254. http://web.archive.org/web/20010421142135/http://www.syscontrol.ch/e/SWePIX/SWePIXe.html
 255. http://babelfish.altavista.com/translate.dyn?doit=done&lp=de_en&bbltype=urltext&url=http://web.archive.org/web/20010421142135/http://www.syscontrol.ch/e/SWePIX/SWePIXe.html
 256. http://news.netcraft.com/archives/2004/06/02/most_reliable_hosting_providers_during_may.html
 257. http://uptime.netcraft.com/
 258. https://dwheeler.com/frozen/top.avg.2001aug3.html
 259. http://uptime.netcraft.com/up/accuracy.html
 260. http://www.acmqueue.com/modules.php?name=Content&pa=showpage&pid=240
 261. http://opensource.mit.edu/papers/maccormackrusnakbaldwin.pdf
 262. http://news.zdnet.co.uk/software/linuxunix/0,39020390,39190950,00.htm
 263. http://arxiv.org/abs/cond-mat/0306511
 264. http://web.archive.org/web/20010208190429/http://www.info-sec.com/OSsec/OSsec_080498g_j.shtml
 265. http://www.abiresearch.com/abiprdisplay2.jsp?pressid=384
 266. http://www.tpc.org/tpcc/results/tpcc_result_detail.asp?id=102091601
 267. http://www.tpc.org/tpcc/results/tpcc_result_detail.asp?id=102060501
 268. http://lwn.net/Articles/10857/
 269. http://searchenterpriselinux.techtarget.com/news/article/0,289142,sid39_gci1332534,00.html
 270. http://www.pcmag.com/article/0,2997,s%253D25068%2526a%253D16554,00.asp
 271. http://physics.nist.gov/cuu/Units/binary.html
 272. http://www.vnunet.com/News/1131114
 273. http://www.sysadminmag.com/articles/2001/0107/0107a/0107a.htm
 274. http://www.sysadminmag.com/
 275. http://www.sysadminmag.com/articles/2001/0108/0108q/0108q.htm
 276. http://www.spec.org/
 277. http://www-106.ibm.com/developerworks/linux/library/l-rt4/?open&t=grl,l=252,p=pipes
 278. http://www-106.ibm.com/developerworks/linux/library/l-rt7/?Open&t=grl,l=252,p=mgth
 279. http://www.eweek.com/article2/0,3959,293,00.asp
 280. http://www.mysql.com/information/benchmarks.html
 281. http://www.wired.com/news/infostructure/0,1377,57625,00.html
 282. http://www.osnews.com/story.php?news_id=4867
 283. http://www.anandtech.com/mac/showdoc.aspx?i=2520
 284. http://blogs.zdnet.com/Murphy/index.php?p=459
 285. http://www.zdnet.com/sp/stories/issue/0,4537,2196115,00.html
 286. http://www.mindcraft.com/whitepapers/nts4rhlinux.html
 287. http://lwn.net/1999/features/MindCraft1.0.phtml
 288. http://www.kegel.com/mindcraft_redux.html
 289. http://www.kegel.com/mindcraft_redux.html
 290. http://www.mindcraft.com/whitepapers/openbench1.html
 291. http://www.heise.de/ct/english/99/13/186-1
 292. http://www.networkcomputing.com/1011/1011f1.html
 293. https://dwheeler.com/oss_fs_why.html#tco
 294. http://www.zdnet.com/enterprise/stories/main/0,10228,2776383,00.html
 295. http://www.zdnet.com/enterprise/stories/main/0,10228,2776519,00.html
 296. http://www.kegel.com/nt-linux-benchmarks.html
 297. http://www.zdnet.com/zdnn/stories/news/0,4586,2760874,00.html
 298. http://www.tpc.org/tpch/results/h-ttperf.idc
 299. http://www.kegel.com/nt-linux-benchmarks.html
 300. http://www.spec.org/
 301. http://dmoz.org/Computers/Performance_and_Capacity/Benchmarking/
 302. http://www.forbes.com/home/enterprisetech/2005/03/15/cz_dl_0315linux.html
 303. http://www.top500.org/
 304. http://lwn.net/Articles/141397/
 305. http://hardware.newsforge.com/article.pl?sid=05/11/15/1443249
 306. http://www.tldp.org/HOWTO/Beowulf-HOWTO.html
 307. http://www.top500.org/list/2001/06/
 308. http://www.cs.sandia.gov/cplant
 309. http://www.linuxworld.com/story/44799.htm
 310. http://www.cbronline.com/article_news.asp?guid=A6E54915-E012-4B62-B7AE-382A4F670154
 311. http://www.archive.org/about/faqs.php
 312. http://www.archive.org/web/petabox.php
 313. http://www.linuxdevices.com/articles/AT8728350077.html
 314. http://www.agendacomputing.com/
 315. http://www.timesofindia.com/300301/30intw1.htm
 316. http://linas.org/linux/i370.html
 317. http://gb.lwn.net/2000/features/FSLCluster
 318. http://www.idg.net/crd_linux_473938_102.html
 319. http://www.freeos.com/articles/3800
 320. http://www.linux.org/projects/ports.html
 321. http://www.kroah.com/log/linux/ols_2006_keynote.html
 322. http://www.netbsd.org/
 323. http://dwheeler.com/sloc
 324. http://www.internetwk.com/story/INW20021028S0003
 325. http://portal.acm.org/citation.cfm?id=1188921
 326. http://www.cyber-rights.org/interception/echelon/European_parliament_resolution.htm
 327. http://cacm.acm.org/magazines/2011/5/107687-is-open-source-security-a-myth/abstract
 328. http://news.cnet.com/news/0-1003-200-6077282.html
 329. http://www.vnunet.com/News/1116081
 330. http://attrition.org/mirror/attrition/os-graphs.html
 331. http://news.cnet.com/news/0-1003-200-5994475.html?tag=rltdnws
 332. http://defaced.alldas.de/
 333. http://www.redhatmagazine.com/2007/04/18/risk-report-two-years-of-red-hat-enterprise-linux-4/
 334. http://news.com.com/Linux+lasting+longer+against+Net+attacks/2100-7349_3-5501278.html
 335. http://www.vnunet.com/news/1160588
 336. http://www.honeynet.org/papers/index.html
 337. http://www.avantgarde.com/xxxxttln.pdf
 338. http://www.usatoday.com/money/industries/technology/2004-11-29-honeypot_x.htm
 339. http://isc.sans.org/survivalhistory.php
 340. http://www.infoworld.com/d/mobilize/vista-breached-linux-remains-unbeaten-in-hacking-contest-752
 341. http://www.securityfocus.com/cgi-gin/vulns.pl
 342. http://abcnews.go.com/sections/tech/FredMoody/moody000802.html
 343. http://web.archive.org/web/20010617174123/http://www.securityfocus.com/templates/forum_message.html?forum=2&head=2782&id=2782
 344. http://www.vnunet.com/News/1128907
 345. http://www.vnunet.com/News/1135481
 346. http://lwn.net/Articles/10962/
 347. http://www.counterpane.com/crypto-gram-0009.html
 348. http://web.archive.org/web/20010608142954/http://securityportal.com/cover/coverstory20000117.html
 349. http://www.eweek.com/article2/0,3959,562226,00.asp
 350. http://www.onlinesecurity.com/links/links835.php
 351. http://www.vnunet.com/analysis/1157431
 352. http://www.eweek.com/article2/0,,1637079,00.asp
 353. http://www.computerworld.com.au/index.php/id;906285078;fp;2;fpid;1
 354. http://www.businesswire.com/cgi-bin/f_headline.cgi?bw.040802/220982285
 355. http://www.zdnet.com/eweek/stories/general/0,11011,2792860,00.html
 356. http://www.zdnet.com/eweek/stories/general/0,11011,2792859,00.html
 357. http://www.microsoft.com/technet/treeview/default.asp?url=/technet/itsolutions/security/current.asp?productid=17&servicepackid=0
 358. http://www.cert.org/advisories/CA-2001-19.html
 359. http://www.eweek.com/article/0,3658,s=1884&a=25302,00.asp
 360. http://blogs.zdnet.com/security/?p=316
 361. http://security.itworld.com/4347/070622vistapatching/page_1.html
 362. http://www.cnn.com/2002/TECH/internet/02/25/2002.security.idg/index.html
 363. http://news.cnet.com/news/0-1003-201-7239473-0.html?tag=nbs
 364. http://www.gartner.com/DisplayDocument?id=336339
 365. http://googleonlinesecurity.blogspot.com/2007/06/web-server-software-and-malware.html
 366. https://dwheeler.com/frozen/cert_current_activity.html
 367. http://icat.nist.gov/
 368. http://www.theregister.co.uk/security/security_report_windows_vs_linux/
 369. http://www.theregister.co.uk/2004/10/22/security_report_windows_vs_linux.pdf
 370. http://www.govtalk.gov.uk/interoperability/egif_document.asp?docnum=430
 371. http://www.vnunet.com/news/1135763
 372. http://www.betanews.com/article/Microsoft_Closes_Activation_Loophole/1109293194
 373. http://www.roaringpenguin.com/mimedefang/anti-virus.php3
 374. http://home.businesswire.com/portal/site/google/index.jsp?ndmViewId=news_view&newsId=20040728005158&newsLang=en
 375. http://www.theregister.co.uk/2004/06/04/trojan_spam_study/
 376. http://www.sandvine.com/solutions/pdfs/spam_trojan_trend_analysis.pdf
 377. http://www.staysafeonline.info/press/060403.pdf
 378. http://securitytracker.com/learn/securitytracker-stats-2002.pdf
 379. http://www.internetnews.com/security/article.php/3374931
 380. http://story.news.yahoo.com/news?tmpl=story&cid=74&e=3&u=/cmp/20040702/tc_cmp/22103407
 381. http://zdnet.com.com/2100-1105_2-5256297.html
 382. http://netsecurity.about.com/gi/dynamic/offsite.htm?site=http://www.computerworld.com/securitytopics/security/holes/story/0%2C10801%2C94366%2C00.html%3Ff=x584
 383. http://zdnet.com.com/2100-1105_2-5250003.html
 384. http://isc.sans.org/diary.php?date=2004-06-25
 385. http://www.wired.com/news/infostructure/0,1377,64065,00.html
 386. http://www.kb.cert.org/vuls/id/323070
 387. http://enterprisesecurity.symantec.com/content.cfm?articleid=1539
 388. http://www.usatoday.com/tech/news/2004-07-01-cyber-threat_x.htm
 389. http://www.nytimes.com/2004/08/12/technology/circuits/12brow.html?ex=1250049600&en=2e6b25eafd7f2db7&ei=5090&partner=rssuserland
 390. http://www.theinquirer.net/?article=16922
 391. http://news.netcraft.com/archives/2004/07/05/browser_wars_to_recommence.html
 392. https://dwheeler.com/oss_fs_why.html#browser-marketshare
 393. http://software.newsforge.com/article.pl?sid=04/07/08/2327246&mode=nested
 394. http://www.mozilla.org/security/shell.html
 395. http://bugzilla.mozilla.org/show_bug.cgi?id=250180
 396. http://www.infoworld.com/article/04/07/12/HNmicromozilla_1.html
 397. http://www.mozillazine.org/talkback.html?article=4997
 398. http://secunia.com/advisories/12048/
 399. http://dwheeler.com/essays/securing-windows.html
 400. http://enterprisesecurity.symantec.com/content.cfm?articleid=1539
 401. http://news.com.com/Symantec+Mozilla+browsers+more+vulnerable+than+IE/2100-1002_3-5873273.html
 402. http://www.zdnet.co.uk/print/?TYPE=story&AT=39219186-39020375t-10000025c
 403. http://nanobox.chipx86.com/ie_is_dangerous.php
 404. http://bcheck.scanit.be/bcheck/page.php?name=STATS2004
 405. http://bcheck.scanit.be/bcheck/
 406. http://blog.washingtonpost.com/securityfix/2007/01/internet_explorer_unsafe_for_2.html
 407. http://news.yahoo.com/s/cmp/20060210/tc_cmp/179102616
 408. http://blogs.washingtonpost.com/securityfix/2006/02/a_time_to_patch.html
 409. http://blogs.washingtonpost.com/securityfix/2006/01/a_timeline_of_m.html
 410. http://www.heinz.cmu.edu/%7Ertelang/disclosure_jan_06.pdf
 411. http://www.networkcomputing.com/1201/1201f1b1.html
 412. http://www.blackwell-synergy.com/links/doi/10.1046/j.1365-2575.2002.00118.x/abs/
 413. http://lwn.net/Articles/131788/
 414. http://www.mi2g.com/cgi/mi2g/frameset.php?pageid=http%3A//www.mi2g.com/cgi/mi2g/press/021104.php
 415. http://www.theregister.co.uk/2002/11/21/why_is_mi2g_so_unpopular/
 416. http://archives.neohapsis.com/archives/fulldisclosure/2004-07/0820.html
 417. http://www.attrition.org/errata/charlatan/mi2g-history.html
 418. http://vmyths.com/resource.cfm?id=64amp;&page=1
 419. http://story.news.yahoo.com/news?tmpl=story2&u=/cmp/20041106/tc_cmp/52200183
 420. http://www.acsac.org/2001/abstracts/thu-1530-b-anderson.html
 421. http://web.archive.org/web/20011218022817/http://www.newsbytes.com/news/01/173039.html
 422. http://www.newsbytes.com/about/index.html
 423. http://techupdate.zdnet.com/techupdate/stories/main/0,14179,2859555,00.html
 424. http://www.acm.org/classics/sep95/
 425. http://dwheeler.com/trusting-trust/
 426. http://www.gcn.com/vol1_no1/daily-updates/35119-1.html
 427. http://niap.nist.gov/cc-scheme/vpl/vpl_type.html
 428. http://www.itsecurity.com/tecsnews/aug2004/aug45.htm
 429. http://www.redhat.com/f/pdf/gov/cceval_roadmap.pdf
 430. http://www.redhat.com/f/pdf/gov/cceval_roadmap.pdf
 431. http://www.gcn.com/24_2/news/34873-1.html
 432. http://www.mandrakesoft.com/company/press/pr?n=/pr/corporate/2509
 433. http://www.nbnn.com/vol1_no1/new_products-technology/27359-1.html
 434. http://csrc.nist.gov/cryptval/140-1/1401val2004.htm
 435. http://oss-institute.org/fips-faq.html
 436. http://answers.google.com/answers/threadview?id=304308
 437. http://www.adti.net/html_files/defense/opensource_debate.html
 438. http://www.wired.com/news/linux/0,1411,52973,00.html
 439. http://lwn.net/Articles/1864/
 440. http://www.upi.com/view.cfm?StoryID=20021227-105113-4829r
 441. http://www.smokefreeforhealth.org/studies/YachBialous.htm
 442. http://www.cs.vu.nl/~ast/brown/
 443. http://dwheeler.com/oss_fs_why.html#microsoft-sells-gpl
 444. http://dwheeler.com/essays/gpl-compatible.html
 445. http://archive.ncsa.uiuc.edu/SDG/Software/Mosaic/License/LicenseInfo.html
 446. http://dwheeler.com/sloc
 447. http://www.cpi.seas.gwu.edu/oss/cpi_rebuttal.pdf
 448. http://www.juliao.org/pub/adti-comments.pdf
 449. http://www.theregister.co.uk/content/4/25659.html
 450. http://lwn.net/Articles/1789/
 451. http://plug.linux.org.au/~leonb/wide-open-source-debate.html
 452. http://www.linuxandmain.com/modules.php?name=News&file=article&sid=97
 453. http://www.cs.vu.nl/~ast/brown/
 454. http://xroads.virginia.edu/~HYPER/DETOC/ch2_05.htm
 455. http://www.atstake.com/research/reports/index.html#opensource_forensics
 456. http://os.newsforge.com/os/04/05/18/1715247.shtml
 457. http://zdnet.com.com/2100-1107-980938.html
 458. http://www.securityfocus.com/commentary/19
 459. http://dev-opensourceit.earthweb.com/news/000526_security.html
 460. http://www.esecurityplanet.com/views/article.php/3665801
 461. http://bsdnexus.com/library/tc0.pdf
 462. https://dwheeler.com/oss_fs_why.html#tco
 463. http://people.redhat.com/mjc/20050505-fc4
 464. http://lwn.net/Articles/139541/
 465. http://www.eweek.com/article2/0,3959,5264,00.asp
 466. http://story.news.yahoo.com/news?tmpl=story&ncid=581&e=1&cid=581&u=/nm/20030331/tc_nm/tech_microsoft_security_dc
 467. http://dwheeler.com/essays/securing-windows.html
 468. http://dwheeler.com/essays/high-assurance-floss.html
 469. http://news.zdnet.com/2100-1009_22-5436305.html?tag=zdfd.newsfeed
 470. http://news.com.com/Students+uncover+dozens+of+Unix+software+flaws/2100-1002_3-5492969.html
 471. http://dwheeler.com/secure-programs/Secure-Programs-HOWTO/open-source-security.html
 472. http://dwheeler.com/secure-programs
 473. http://www.sun.com/servers/workgroup/tco/metastudy.html
 474. http://www.vnunet.com/Analysis/85833
 475. http://www.xephon.com/
 476. http://people.redhat.com/~hp/stateless/StatelessLinux.pdf
 477. http://www.csc.com/features/2004/uploads/LEF_OPENSOURCE.pdf
 478. http://www.murdoch.edu.au/elaw/issues/v10n4/halbert104_text.html
 479. http://www.cl.cam.ac.uk/users/rja14/tcpa-faq.html
 480. http://www.onlamp.com/pub/a/onlamp/2005/07/21/software_pricing.html
 481. http://www.infoworld.com/articles/hn/xml/02/10/01/021001hnbizopen.xml?s=IDGNS
 482. http://www.linux.org/dist/index.html
 483. http://www.osv.org.au/index.cgi?tid=162
 484. http://www.cyber.com.au/
 485. http://consultingtimes.com/Serverheist.html
 486. http://www.jimmo.com/Linux-NT_Debate/Cost_Comparison.html
 487. http://osopinion.com/perl/story/9849.html
 488. http://www.groklaw.net/article.php?story=20050106075631519
 489. http://asyd.net/docs/misc/comparing_the_gpl_to_eula.pdf
 490. https://www.eff.org/wp/dangerous-terms-users-guide-eulas
 491. https://dwheeler.com/oss_fs_why.html#licensing-litigation
 492. http://www.microsoft.com/windows2000/server/evaluation/sysreqs/default.asp
 493. http://www.sciam.com/2001/0801issue/0801hargrove.html
 494. http://www.linuxworld.com/site-stories/2001/0823.xterminal.html
 495. http://www.linuxworld.com/site-stories/2002/0403.tco.html
 496. http://www.linuxjournal.com//article.php?sid=7788
 497. http://flosspols.org/deliverables/FLOSSPOLS-D03%20local%20governments%20survey%20reportFINAL.pdf
 498. http://www.groklaw.net/article.php?story=20060119080108568
 499. http://www.nwfusion.com/news/2001/0319specialfocus.html
 500. http://www.vnunet.com/Analysis/1155278
 501. http://www.rfgonline.com/subsforum/LinuxTCO.pdf
 502. http://www.epractice.eu/files/media/media_479.pdf
 503. http://www.groklaw.net/article.php?story=20070112025016466
 504. http://www.cybersource.com.au/about/linux_vs_windows_tco_comparison.pdf
 505. http://www.theage.com.au/news/Breaking/TCO-study-Linux-wins-again/2004/12/13/1102786990788.html
 506. http://www.cyber.com.au/cyber/about/linux_vs_windows_tco_comparison.pdf
 507. http://www.cyber.com.au/cyber/about/linux_vs_windows_pricing_comparison.pdf
 508. http://www.linuxjournal.com/article.php?sid=6057&mode=thread&order=0
 509. http://www.suffritti.it/informatica/comparazione_TCO_win_linux.htm
 510. http://babel.altavista.com/?urltext=http%3A%2F%2Fwww.suffritti.it%2Finformatica%2Fcomparazione_TCO_win_linux.htm&lp=it_en
 511. http://www.itnews.com.au/newsstory.aspx?CIaNID=42505&src=site-marq
 512. http://www.rfgonline.com/subsforum/LinuxTCO.pdf
 513. http://www-1.ibm.com/linux/whitepapers/robertFrancesGroupLinuxTCOAnalysis05.pdf
 514. http://www.netproject.com/opensource/coo.html
 515. http://www.gartner.com/DisplayDocument?id=406459
 516. http://www.thalix.com/files/EMA_Levanta-Linux_RR.pdf
 517. http://www.theopenenterprise.com/story/TOE20020926S0002
 518. http://www.researchandmarkets.com/reports/c8216/
 519. http://www.tes.co.uk/2094985
 520. http://news.zdnet.co.uk/software/linuxunix/0,39020390,39196487,00.htm
 521. http://management.silicon.com/government/0,39024677,39129956,00.htm
 522. http://www.egovmonitor.com/node/695
 523. http://www.schoolforge.org.uk/index.php/Becta_14/4/05
 524. http://web.archive.org/web/20011201023315/www.robval.com/linux/desktop/index.html
 525. http://www.zdnet.com/zdnn/stories/news/0,4586,5098955,00.html
 526. http://news.cnet.com/news/0-1003-200-7720536.html?tag=owv
 527. http://techupdate.zdnet.com/techupdate/stories/main/0,14179,2860180,00.html
 528. http://www.dellhost.com/solutions/dedicated/d2800.asp
 529. http://www.teknologiraadet.dk/subpage.php3?article=851&survey=1&language=dk&front=1
 530. https://dwheeler.com/oss_fs_why.html#usereports
 531. http://www.varbusiness.com/sections/News/breakingnews.asp?ArticleID=36355
 532. http://www.theregister.co.uk/content/4/26230.html
 533. http://www.infoworld.com/articles/hn/xml/02/12/02/021202hnlinuxcosts.xml?s=IDGNS
 534. http://www.microsoft.com/windows2000/docs/TCO.pdf
 535. http://www.linuxworld.com/site-stories/2002/1219.barr.html
 536. http://download.microsoft.com/download/7/3/e/73e77129-db34-4c95-b182-ab0b9bd50081/TEICaseStudy.pdf
 537. http://comment.cio.com/soundoff/091103.html
 538. http://i.cmpnet.com/infoweek/1057/IWKLinuxOutlook-2005.pdf
 539. http://www.linux.com/article.pl?sid=06/12/04/1538214
 540. http://freshmeat.net/projects/evergreen/
 541. http://www.linux.com/feature/145797
 542. https://dwheeler.com/oss_fs_why.html#mitre-business-case
 543. http://www.computerworld.com/developmenttopics/development/webservices/story/0,10801,102638,00.html
 544. http://www.members.optushome.com.au/brendanscott/papers/freesoftwaretco150702.html
 545. http://fud-counter.nl.linux.org/tech/TCO.html
 546. http://fud-counter.nl.linux.org/tech/TCO2.html
 547. http://www.sun.co.uk/consolidation/pdf/linuxworld-reprint.pdf
 548. http://www.theregister.co.uk/2004/09/09/ms_capgemini_newham_report/
 549. http://trends.newsforge.com/trends/04/11/03/181215.shtml
 550. http://www.sims.berkeley.edu/~hal/Papers/2004/linux-adoption-in-the-public-sector.pdf
 551. http://www.computerworld.com/softwaretopics/os/story/0,10801,83708,00.html
 552. http://www.relevantive.de/Linux.html
 553. https://dwheeler.com/oss_fs_why.html#single-source
 554. https://dwheeler.com/oss_fs_why.html#licensing-litigation
 555. https://dwheeler.com/oss_fs_why.html#greater-flexibility
 556. https://dwheeler.com/oss_fs_why.html#social-moral
 557. https://dwheeler.com/oss_fs_why.html#innovation
 558. http://www.sunbelt-software.com/survey_02mar.cfm
 559. http://www.clendons.co.nz/microsoft_complaint.htm
 560. http://www.hevanet.com/peace/microsoft.htm
 561. http://www.antipatterns.com/vendorlockin.htm
 562. http://www.antipatterns.com/briefing/index.htm
 563. http://ars.userfriendly.org/cartoons/?id=20070901&mode=classic
 564. http://newsforge.com/newsforge/02/09/20/1543210.shtml?tid=23
 565. http://www.licenturion.com/xp/fully-licensed-wpa.txt
 566. http://news.com.com/2100-1001-244052.html?legacy=cnet&tag=st.ne.1002.bgif.ni
 567. http://www.sfgate.com/cgi-bin/article.cgi?file=/gate/archive/2002/02/07/bsa.DTL
 568. http://www.aaxnet.com/topics/slicense.html
 569. http://www.itworld.com/Man/2685/lw-12-vcontrol_2
 570. http://www.temple.edu/temple_times/3-23-00/set.html
 571. http://story.news.yahoo.com/news?tmpl=story&ncid=582&e=1&cid=582&u=/nm/20021029/wr_nm/retail_kmart_bluelight_dc
 572. http://www.worldtrademag.com/CDA/ArticleInformation/coverstory/BNPCoverStoryItem/0,3481,76659,00.html
 573. http://www.guardian.co.uk/online/story/0,3605,267330,00.html
 574. http://www.theregister.co.uk/2004/07/22/mewngofnodi
 575. http://www.itworldcanada.com/Pages/Docbase/ViewArticle.aspx?id=idgml-8f87ddb3-bfe0-4b69&s=90323
 576. http://www.gnu.org/philosophy/philosophy.html
 577. http://www.informationweek.com/story/showArticle.jhtml?articleID=51201599&tid=5979
 578. http://i.cmpnet.com/infoweek/1057/IWKLinuxOutlook-2005.pdf
 579. http://opensource.mit.edu/papers/kogut1.pdf
 580. http://web.archive.org/web/20040610231701/www.osdn.com/bcg/
 581. http://www.fcw.com/article88470-04-04-05-Print
 582. http://www.catb.org/~esr/writings/cathedral-bazaar/
 583. http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/ar01s07.html
 584. http://article.gmane.org/gmane.linux.hotplug.devel/7039
 585. http://article.gmane.org/gmane.linux.hotplug.devel/7070
 586. http://www.linux.com/article.pl?sid=06/07/24/164213
 587. http://dreamsongs.com/IHE/
 588. http://news.cnet.com/news/0-1014-201-8155733-0.html
 589. http://www.mozilla.org/products/firefox/
 590. http://simon.incutio.com/archive/2004/09/14/liveBookmarks
 591. http://www.blakeross.com/archives/000220.html
 592. http://web.mit.edu/evhippel/www/
 593. http://www.cio.com/archive/101500/something.html
 594. http://opensource.mit.edu/papers/rp-vonhippelfranke.pdf
 595. http://opensource.mit.edu/papers/lakhanivonhippelusersupport.pdf
 596. http://opensource.mit.edu/papers/vonhippel3.pdf
 597. http://opensource.mit.edu/papers/hippelkrogh.pdf
 598. http://opensource.mit.edu/papers/evhippel-osuserinnovation.pdf
 599. http://opensource.mit.edu/papers/henkel.pdf
 600. http://opensource.mit.edu/papers/chanlee.pdf
 601. http://www.itworld.com/it-managementstrategy/187573/new-draw-open-source-innovation
 602. http://opensource.mit.edu/papers/lin2.pdf
 603. http://www.businesswire.com/cgi-bin/f_headline.cgi?bw.091404/242585281
 604. http://www.ced.org/docs/report/report_dcc.pdf
 605. http://linuxtoday.com/stories/8242.html
 606. http://www.newsforge.com/article.pl?sid=04/11/01/1927212
 607. http://www.koders.com/
 608. http://story.news.yahoo.com/news?tmpl=story&cid=581&e=1&u=/nm/20050402/tc_nm/column_pluggedin_dc
 609. http://www.youtube.com/watch?v=u6XAPnuFjJc&feature=related
 610. http://www.eweek.com/article2/0,3959,885490,00.asp
 611. http://www.opensource.org/halloween/halloween1.html
 612. http://dwheeler.com/innovation
 613. http://dwheeler.com/innovation/microsoft.html
 614. http://www.dyncorp-is.com/darpa/meetings/win98aug/wars.html
 615. http://story.news.yahoo.com/news?tmpl=story&cid=1093&e=6&u=/pcworld/20040824/tc_pcworld/117531
 616. http://news.cnet.com/news/0-1014-201-7921483-0.html
 617. http://www.wirednews.com/wired/archive/11.11/opensource.html
 618. http://www.redhat.com/
 619. http://www.novell.com/linux/suse/
 620. http://www.mandrakesoft.com/
 621. http://www.canonical.com/
 622. http://www.ubuntulinux.org/
 623. http://www.debian.org/consultants/
 624. http://www.openbsd.org/support.html
 625. http://www.networkcomputing.com/1309/1309f3.html
 626. http://support.decisionone.com/mozilla/mozilla_help_main.htm
 627. http://support.mozsource.com/a/news
 628. http://www.gnat.com/
 629. http://www.mysql.com/
 630. http://www.cluetrain.com/
 631. http://www.catb.org/~esr/faqs/smart-questions.html
 632. http://www.chiark.greenend.org.uk/~sgtatham/bugs.html
 633. http://news.com.com/Microsoft+walks+VB+tight+rope/2100-1007_3-5620821.html?tag=nefd.lede
 634. http://www.eweek.com/article2/0,1759,1655796,00.asp
 635. http://rblevin.blogspot.com/2005/03/microsoft-mvps-revolt.html
 636. http://pubs.logicalexpressions.com/Pub0009/LPMArticle.asp?ID=516
 637. http://fox.wikis.com/wc.dll?Wiki~VisualFred~SoftwareEng
 638. http://catb.org/~esr/jargon/html/V/Visual-Fred.html
 639. http://www.visual-expert.com/us/info/survey_vb_2004_results.htm
 640. http://classicvb.org/petition/
 641. http://www.codinghorror.com/blog/archives/000235.html
 642. http://www.eweek.com/article2/0,1759,1655796,00.asp
 643. http://classicvb.org/
 644. http://gendotnet.com/Blog/archive/2005/03/09/779.aspx
 645. http://blogs.zdnet.com/BTL/index.php?p=1141
 646. http://www.linuxjournal.com/article.php?sid=5073
 647. http://www.prnewswire.com/cgi-bin/stories.pl?ACCT=104&STORY=/www/story/12-15-2004/0002633825
 648. http://www.mass.gov/itd/legal/ninewaysprotectriskopensource.htm
 649. http://www.groklaw.net/article.php?story=2005011418070774
 650. http://www.businessweek.com/technology/content/feb2005/tc2005027_4780.htm
 651. http://www.llrx.com/features/opensource.htm
 652. http://www.groklaw.net/article.php?story=20040521100818411
 653. http://www.groklaw.net/article.php?story=20040419080041607
 654. http://www.groklaw.net/
 655. http://www.cs.vu.nl/~ast/brown/codecomparison/
 656. http://www.cs.vu.nl/~ast/brown
 657. http://www.cs.vu.nl/~ast/brown/followup/
 658. http://www.cs.vu.nl/~ast/brown/rebuttal/
 659. http://trends.newsforge.com/trends/04/05/24/2145237.shtml
 660. http://www.mlive.com/newsflash/business/index.ssf?/newsflash/get_story.ssf?/cgi-free/getstory_ssf.cgi?f0023_BC_WSJ--Portals&&news&newsflash-financial
 661. http://www.gimp.org/about/ancient_history.html
 662. http://apache.rcbowen.com/ApacheServer.html#Introduction_What_is_Apache
 663. http://www.gnu.org/philosophy/enforcing-gpl.html
 664. http://moglen.law.columbia.edu/publications/maine-speech.html
 665. http://news.com.com/2100-7344-5198117.html
 666. http://solutions.journaldunet.com/0405/040512_juridique.shtml
 667. http://www.jbb.de/html/?page=news&id=32
 668. http://www.heise.de/newsticker/meldung/49377
 669. http://www.groklaw.net/article.php?story=20050225223848129
 670. http://support.microsoft.com/default.aspx?scid=http://support.microsoft.com:80/support/kb/articles/Q306/8/19.ASP&NoWebContent=1#10
 671. http://www.gzip.org/zlib/apps.html
 672. http://www.osdllinuxsummit.org/presentations/tut3(Final)_Copenhaver_Reviewing%20Use%20of%20OSS%20in%20the%20Enterprise.pdf
 673. http://www.ht-technology.com/cherryos-pearpc/cherryos-pearpc.html
 674. http://lwn.net/Articles/73848/
 675. http://gpl-violations.org/
 676. http://gpl-violations.org/faq/violation-faq.html
 677. http://www.fsf.org/licensing/compliance.html
 678. http://www.gnu.org/licenses/gpl-faq.html
 679. http://www.catb.org/~esr/writings/magic-cauldron/
 680. http://www-106.ibm.com/developerworks/linux/library/license.html?dwzone=linux
 681. http://management.itmanagersjournal.com/management/04/05/10/2052216.shtml?tid=85
 682. http://perens.com/Articles/Economic.html
 683. http://www.siliconvalley.com/mld/siliconvalley/4996371.htm
 684. http://news.com.com/2100-1001-825723.html
 685. http://news.ft.com/cms/s/78d9812a-2386-11d9-aee5-00000e2511c8.html#U101244209021g4
 686. http://news.com.com/Firefox+fortune+hunters/2100-1032_3-5455173.html
 687. http://news.ft.com/servlet/ContentServer?pagename=FT.com/StoryFT/FullStory&c=StoryFT&cid=1042490975962&p=1012571727085
 688. http://www.joelonsoftware.com/articles/StrategyLetterV.html
 689. http://clustering.foundries.sourceforge.net/article.pl?sid=02/08/19/1426245
 690. http://www.mech.kuleuven.ac.be/~bruyninc/linux/economy-oss.html
 691. http://www.newscientist.com/hottopics/copyleft/copyleftart.jsp
 692. http://www.benkler.org/CoasesPenguin.html
 693. http://stephesblog.blogs.com/presentations/BrentWilliamsEclipseConV02.pdf
 694. http://www.linux.com/news/featured-blogs/158-jim-zemlin/464045-who-says-you-cant-make-money-with-open-source
 695. http://www.catb.org/~esr/writings/magic-cauldron/
 696. http://www.opensource.org/advocacy/jobs.php
 697. http://perens.com/Articles/Economic.html
 698. http://kb.mozillazine.org/index.phtml?title=Bounties
 699. http://www.gnome.org/bounties/
 700. http://www.horde.org/bounties/
 701. http://www.asterisk.org/
 702. http://www.limewire.org/wishlist.shtml
 703. http://www.i2p.net/bounties
 704. http://mantisbt.org/
 705. http://manual.mantisbt.org/manual.configuration.sponsorship.php
 706. http://www.plkr.org/
 707. https://bugzilla.mozilla.org/show_bug.cgi?id=124096
 708. http://www.markshuttleworth.com/bounty.html
 709. http://www.spi-inc.org/
 710. http://www.opensourcexperts.com/bountylist.html
 711. https://www.pubsoft.org/pubsoft.py/
 712. http://www.ideacradle.com/
 713. http://www.dropcash.com/
 714. http://www.mozilla.org/press/mozilla-2004-08-02.html
 715. http://www.fundable.org/
 716. http://business.newsforge.com/business/05/07/07/1330241.shtml
 717. http://www.blender3d.org/cms/History.53.0.html
 718. http://www.freeos.com/articles/4087
 719. http://clustering.foundries.sourceforge.net/article.pl?sid=02/08/19/1426245
 720. http://www.mozillazine.org/talkback.html?article=3976
 721. http://web.archive.org/web/20040610231701/www.osdn.com/bcg/
 722. http://gcn.com/vol1_no1/daily-updates/26641-1.html
 723. http://www.linuxfoundation.org/docs/lf_linux_kernel_development_2010.pdf
 724. http://www.msnbc.msn.com/id/5907194/
 725. http://www.oblomovka.com/entries/2004/07/29#1091150520
 726. http://www.businessweek.com/magazine/content/05_05/b3918001_mz001.htm
 727. http://mobile.newsforge.com/article.pl?sid=05/06/08/1948202&from=rss
 728. http://news.com.com/Firefox+fortune+hunters/2100-1032_3-5455173.html
 729. http://www.investors.com/editorial/IBDArticles.asp?artsec=16&issue=20050921
 730. http://searchvb.techtarget.com/originalContent/0,289142,sid8_gci1036918,00.html
 731. http://www.fsf.org/jobs
 732. http://www.crn.com/news/channel-programs/206900235/report-open-source-adoption-increases-app-dev-pay.htm
 733. http://en.wikipedia.org/wiki/X_Window_System
 734. http://httpd.apache.org/ABOUT_APACHE.html
 735. http://linuxinsider.com/story/39290.html
 736. http://www.neuralscape.com/cgi-bin/businessissues.cgi?disruptive_Linux.txt
 737. http://itmanagement.earthweb.com/career/article.php/3774811
 738. http://linuxtoday.com/news_story.php3?ltsn=2001-05-16-012-20-OP
 739. http://www.itworld.com/Man/2685/031208torvalds/
 740. https://dwheeler.com/oss_fs_why.html#ossfs-economically-viable
 741. https://dwheeler.com/oss_fs_why.html#wont-destroy-industry
 742. http://www.freedesktop.org/
 743. http://www.freestandards.org/
 744. http://www.freestandards.org/
 745. http://freedesktop.org/
 746. http://www.linuxbase.org/
 747. http://www.pathname.com/fhs/
 748. http://www.x.org/
 749. http://www.ietf.org/
 750. http://w3.org/
 751. https://dwheeler.com/oss_fs_why.html#single-source
 752. http://www.oreillynet.com/manila/tim/stories/storyReader$167
 753. http://slashdot.org/article.pl?sid=02/05/03/1654235&mode=thread
 754. http://web.archive.org/web/20011103204837/http://www.shared-source.org/index.html
 755. http://www.gnu.org/press/2001-05-04-GPL.html
 756. http://www.theregister.co.uk/content/4/19836.html
 757. https://dwheeler.com/oss_fs_why.html#microsoft-sells-gpl
 758. http://moglen.law.columbia.edu/publications/lu-12.html
 759. http://www.linuxdevices.com/articles/AT4739816141.html
 760. http://www.pcworld.com/news/article/0,aid,115547,00.asp
 761. http://www.onlamp.com/pub/a/onlamp/2005/03/24/shared_source.html
 762. http://news.zdnet.com/2100-3513_22-5384769.html
 763. http://dmoz.org/Computers/Software/Operating_Systems/Open_Source/
 764. http://zdnet.com.com/2100-11-520393.html?legacy=zdnn
 765. http://www.theregister.co.uk/2000/06/29/bill_gates_roots/
 766. https://dwheeler.com/oss_fs_why.html#greater-flexibility
 767. http://www.eweek.com/article2/0,1759,1859740,00.asp
 768. http://perens.com/Articles/StandTogether.html
 769. http://sourceforge.net/softwaremap/trove_list.php?form_cat=160
 770. http://sourceforge.net/softwaremap/trove_list.php?form_cat=199
 771. http://www.microsoft.com/windows/sfu/default.asp
 772. http://dwheeler.com/frozen/microsoft-interix-gpl.txt
 773. http://www.newsfactor.com/perl/story/11454.html
 774. http://www.thestandard.com/article/0,1902,27511,00.html
 775. http://www.groklaw.net/article.php?story=20070501092619462
 776. http://www.nytimes.com/2002/09/18/opinion/18WED2.html
 777. http://www.idc.com/itforecaster/itf20000808.stm
 778. http://www.itworld.com/Comp/2362/020917sunlinux/
 779. http://www.usdoj.gov/atr/cases/f3800/msjudgex.htm
 780. http://www.computerworld.com/governmenttopics/government/legalissues/story/0,10801,58278,00.html
 781. http://www.acmqueue.com/modules.php?name=Content&pa=showpage&pid=55
 782. http://www.sunbelt-software.com/survey_02mar.cfm
 783. http://www.infoworld.com/article/03/04/04/14gripe_1.html
 784. http://techupdate.zdnet.com/techupdate/stories/main/0,14179,2863077,00.html
 785. https://dwheeler.com/oss_fs_why.html#governments
 786. http://desktoplinux.com/articles/AT9664091996.html
 787. http://www.consultingtimes.com/Largo.html
 788. http://www.linuxdevcenter.com/pub/a/linux/2004/06/10/win4lin.html
 789. http://www.silicon.com/bin/bladerunner?30REQEVENT=&REQAUTH=21046&14001REQSUB=REQINT1=45449
 790. http://desktoplinux.com/
 791. http://www.osafoundation.org/desktop-linux-overview.pdf
 792. http://www.aceshardware.com/read.jsp?id=60000248
 793. http://techupdate.zdnet.com/techupdate/stories/main/0,14179,2878206,00.html
 794. http://69.56.255.194/?article=13350
 795. http://news.zdnet.co.uk/software/windows/0,39020396,39119059,00.htm
 796. http://defectivebydesign.org/
 797. http://defectivebydesign.org/
 798. http://catb.org/~esr/writings/world-domination/world-domination-201.html
 799. http://techupdate.zdnet.com/techupdate/stories/main/0,14179,2860180,00.html
 800. http://www.businessweek.com/technology/content/may2002/tc20020515_3723.htm
 801. http://news.cnet.com/news/0-1003-200-7720536.html?tag=owv
 802. http://www.computerworld.com/cwi/story/0,1199,NAV47_STO67867,00.html
 803. http://techupdate.zdnet.com/techupdate/stories/main/0,14179,2825019,00.html
 804. http://www.crn.com/Sections/CoverStory/CoverStory.asp?ArticleID=31793
 805. http://web.archive.org/web/20011201023315/www.robval.com/linux/desktop/index.html
 806. http://rudd-o.com/archives/2006/02/11/linux-to-windows-a-corporate-success-story/
 807. http://www.midmarket.eweek.com/c/a/News/Running-Only-on-Open-Source-Software/
 808. http://www.linuxplanet.com/linuxplanet/reports/4216/1
 809. http://www.linuxplanet.com/linuxplanet/reports/4306/
 810. http://edge-op.org/grouch/schools.html
 811. http://www.symonds.net/~fsug-kochi/mass-memo.html
 812. http://orange.math.buffalo.edu/csc/resolution2_april2003_approved.html
 813. http://www.netc.org/openoptions/
 814. http://www.internetnews.com/ent-news/article.php/3501561
 815. http://www.computerworld.com.au/index.php/id;101366230;fp;16;fpid;0
 816. http://news.com.com/2100-1017-827366.html
 817. http://www.forbes.com/home/2002/03/27/0327linux.html
 818. http://desktoplinux.com/articles/AT9664091996.html
 819. http://zdnet.com.com/2100-1104-887961.html
 820. http://siliconvalley.internet.com/news/article.php/1276851
 821. http://www.computerworld.com/softwaretopics/os/linux/story/0,10801,75271,00.html
 822. http://www.economist.com/agenda/displayStory.cfm?story_id=1338664
 823. http://zdnet.com.com/2100-1104-828802.html
 824. http://www.businessweek.com/technology/content/may2002/tc20020515_3938.htm
 825. http://www.reuters.com/news_article.jhtml?type=technologynews&StoryID=998076
 826. http://www.linuxjournal.com/article.php?sid=2494
 827. http://www.computer.org/computer/homepage/0202/ec/
 828. http://newsforge.com/newsforge/02/04/24/1643238.shtml?tid=23
 829. http://www.linuxjournal.com/article.php?sid=6011
 830. http://www.nytimes.com/2002/06/18/technology/18LINU.html?pagewanted=print&position=top
 831. http://www.linuxdevices.com/articles/AT4739871225.html
 832. http://www.computerworld.com/softwaretopics/os/linux/story/0,10801,75294,00.html
 833. http://www.hpworld.com/hpworldnews/hpw009/02nt.html
 834. http://public.yahoo.com/~radwin/talks/
 835. http://www.computerworld.com.au/idg2.nsf/All/2ADD84E6EBCEADE9CA256CB30075FA01!OpenDocument
 836. http://www.bloor-research.com/press.php?id=56
 837. http://newsforge.com/newsforge/02/06/17/1514234.shtml?tid=11
 838. http://www.wired.com/news/culture/0,1284,52669,00.html
 839. http://news.com.com/2009-1069-962579.html
 840. http://www.eweek.com/article2/0,3959,857638,00.asp
 841. http://www.infoworld.com/articles/hn/xml/02/06/10/020610hnopensource.xml
 842. http://www.idg.net/ic_874686_1794_9-10000.html
 843. http://li.org/
 844. http://li.org/success/
 845. http://www.mandrakebizcases.com/
 846. http://www.redhat.com/solutions/migration/#tools
 847. http://www.opensource.org/advocacy/case_studies.html
 848. http://www.dravis.net/pages/1/index.htm
 849. http://codebook.jot.com/Book
 850. http://www.pcworld.com/news/article/0,aid,95904,00.asp
 851. http://www.groklaw.net/article.php?story=20050215071109231
 852. http://www.techweb.com/wire/story/TWB19990906S0003
 853. http://www.adityanag.org/articles/ooo_interview.htm
 854. http://www.theregister.co.uk/2002/05/19/ms_in_peruvian_opensource_nightmare/
 855. http://www.csis.org/tech/it/#oss
 856. http://www.csis.org/tech/OpenSource/0408_ospolicies.pdf
 857. http://www.osaia.org/documents/OSAIA%20Policy%20Tracker%20v2.pdf
 858. http://www.infonomics.nl/FLOSS
 859. http://news.com.com/2100-1001-272299.html?legacy=cnet
 860. http://egovos.org/
 861. http://www.teknologiradet.no/html/592.htm
 862. http://www.aei.brookings.org/publications/abstract.php?pid=296
 863. http://www.linuxjournal.com/article/8449
 864. http://www.theregister.co.uk/2005/01/07/gov_open_source_dynamic/
 865. http://www.nytimes.com/2002/09/05/technology/05CODE.html
 866. http://newsforge.com/newsforge/02/10/20/1746231.shtml?tid=4
 867. http://newsforge.com/newsforge/03/04/30/1926238.shtml
 868. http://www.cptech.org/ecom/gp.html
 869. http://www.sims.berkeley.edu/~hal/Papers/2004/linux-adoption-in-the-public-sector.pdf
 870. http://www.computerworld.com/softwaretopics/os/story/0,10801,100494,00.html
 871. http://jakaplan.blogspot.com/2006/10/why-do-militaries-love-open-source.html
 872. http://www.linuxjournal.com/article/8037
 873. http://www.iosn.net/
 874. http://www.iosn.net/education/foss-education-primer
 875. http://europa.eu.int/idabc/en/document/2623#migration
 876. http://www.kbst.bund.de/Anlage303777/pdf_datei.pdf
 877. http://www.andwest.com/blojsom/blog/tatle/agenda/2005/06/27/Norwegian_Minister_Proprietary_Standards_No_Longer_Acceptable_in_Communication_with_Government.html
 878. http://danbricklin.com/log/2004_12_20.htm
 879. http://www.theregister.co.uk/content/archive/25157.html
 880. http://www.theregister.co.uk/content/4/26335.html
 881. http://www.kuro5hin.org/story/2002/6/3/55433/41738
 882. http://www.eud.com/2004/12/30/eco_art_30111A.shtml
 883. http://babelfish.altavista.com/babelfish/trurl_pagecontent?url=http%3A%2F%2Fwww.eud.com%2F2004%2F12%2F30%2Feco_art_30111A.shtml&lp=es_en
 884. http://www.iosn.net/l10n/foss-localization-primer/foss-localization-primer.pdf
 885. http://www.whitehouse.gov/omb/memoranda/fy04/m04-16.html
 886. http://www.egovos.org/rawmedia_repository/822a91d2_fc51_4e6e_8120_1c2d4d88fa06?/document.pdf
 887. http://www.whitehouse.gov/omb/egov/a-1-fea.html
 888. http://www.ccic.gov/ac/pitac_ltr_sep11.html
 889. https://dwheeler.com/oss_fs_why.html#mitre-business-case
 890. http://www.nas.nasa.gov/Research/Reports/Techreports/2003/nas-03-009-abstract.html
 891. http://ossim.org/documentation/ospr.html
 892. http://www.gocc.gov/
 893. https://www.core.gov/
 894. http://europa.eu.int/idabc/en/chapter/5649
 895. http://www.networkworld.com/news/2007/110207-survey-open-source-gaining-traction.html
 896. http://www.federalopensourcealliance.com/
 897. http://www.nas.nasa.gov/Resources/Software/Open-Source/opensource.html
 898. http://www.linuxinsider.com/story/Linux-Use-Drives-Innovation-126000XFH8B0.xhtml
 899. http://limnthis.typepad.com/limn_this/government_and_open_source/index.html
 900. http://skyscraper.fortunecity.com/mondo/841/documents/99-184.html
 901. http://www.netaction.org/opensrc/future
 902. http://www.netaction.org/opensrc/oss-report.html
 903. http://www.infoworld.com/article/03/02/12/HNrhat_1.html
 904. http://news.cnet.com/8301-13505_3-10037544-16.html?part=rss&tag=feed&subj=TheOpenRoad
 905. http://www.opensector.org/1065104758
 906. http://www.mass.gov/itd/openstandards.htm
 907. http://www.report.cpr.ca.gov/cprrpt/issrec/stops/it/so10.htm
 908. http://linuxjournal.com/article.php?sid=7827
 909. http://www.arb.ca.gov/Oss/oss.htm
 910. http://www.linuxjournal.com/node/7877
 911. http://www.adacore.com/home/
 912. http://expect.nist.gov/
 913. http://sourceforge.net/projects/expect
 914. http://www.osgeo.org/
 915. http://www.linux.com/articles/58836
 916. https://www.giglite.org/
 917. http://w2cog.org/
 918. http://www.nas.nasa.gov/Resources/Software/Open-Source/opensource.html
 919. http://www.rvooz.org/
 920. http://limnthis.typepad.com/limn_this/2007/11/new-open-source.html
 921. http://www.delta3d.org/
 922. http://www.scs.org/pubs/jdms/vol3num3/JDMSIITSECvol3no3McDowell143-154.pdf
 923. http://limnthis.typepad.com/limn_this/2007/04/delta3d.html
 924. http://oss-institute.org/whitepapers/NCDG_Hamel_07-004.pdf
 925. http://www.acq.osd.mil/jctd/articles/OTDRoadmapFinal.pdf
 926. http://opentechdev.org/
 927. http://ec.europa.eu/enterprise/ict/policy/doc/2006-11-20-flossimpact.pdf
 928. http://europa.eu.int/ida/oso
 929. http://europa.eu.int/ida/en/chapter/469
 930. http://europa.eu.int/ida/en/chapter/470
 931. http://europa.eu.int/ida/en/chapter/472
 932. http://europa.eu.int/ISPO/ida/jsps/index.jsp?fuseAction=showDocument&parent=news&documentID=1647
 933. http://www.infodev.org/symp2003/publications/OpenSourceSoftware.pdf
 934. http://www.cospa-project.org/
 935. http://europa.eu.int/ispo/ida
 936. http://www.vnunet.com/News/1136433
 937. http://lmaugustin.typepad.com/lma/2008/09/commercial-open-source-in-europe-verses-the-us.html
 938. http://www.washingtonpost.com/wp-dyn/articles/A59197-2002Nov2.html
 939. http://news.bbc.co.uk/hi/english/business/newsid_2023000/2023127.stm
 940. http://news.bbc.co.uk/2/hi/business/2023127.stm
 941. http://www.computerworld.com/softwaretopics/os/linux/story/0,10801,81588,00.html
 942. http://www.usatoday.com/money/industries/technology/2003-07-13-microsoft-linux-munich_x.htm
 943. http://quote.bloomberg.com/apps/news?pid=10000085&sid=aYjHOozAjHAE
 944. http://lxer.com/module/newswire/view/77291/index.html
 945. http://www.heise.de/english/newsticker/news/55253
 946. http://linuxtoday.com/news_story.php3?ltsn=2002-06-17-011-26-NW-DP-PB
 947. http://www.statskontoret.se/pdf/200308eng.pdf
 948. http://www.statskontoret.se/op/
 949. http://lwn.net/Articles/13301
 950. http://www.tekno.dk/pdf/projekter/p03_opensource_paper_english.pdf
 951. http://www.ogc.gov.uk/index.asp?docid=2190#finalreport
 952. http://www.ogc.gov.uk/index.asp?docid=2190#finalreport
 953. http://www.theregister.co.uk/2004/10/28/ogc_oss_pilot_report/
 954. http://www.egovmonitor.com/node/319
 955. http://www.infoworld.com/article/05/04/06/HNukopensource_1.html
 956. http://www.npr.org/templates/story/story.php?storyId=4471963
 957. http://www.foxnews.com/story/0,2933,145827,00.html
 958. http://dwheeler.com/essays/fisl2005.html
 959. http://www.softex.br/cgi/cgilua.exe/sys/start.htm?infoid=5565&sid=37
 960. http://www.gnu.org.pe/proleyap.html
 961. http://www.oreillynet.com/cs/weblog/view/wlg/1364
 962. http://www.opensource.org/docs/peru_and_ms.php
 963. http://www.gnu.org.pe/resmseng.html
 964. http://www.theregister.co.uk/2002/05/19/ms_in_peruvian_opensource_nightmare/
 965. http://linuxtoday.com/news_story.php3?ltsn=2002-05-06-012-26-OS-SM-LL
 966. http://slashdot.org/article.pl?sid=02/05/04/220237&mode=thread&tid=117
 967. http://www.linuxjournal.com//article.php?sid=6049
 968. http://www.pcworld.com/news/article/0,aid,101879,00.asp
 969. http://www.theregister.co.uk/content/4/23667.html
 970. http://www.kuro5hin.org/story/2002/6/3/55433/41738
 971. http://news.zdnet.co.uk/software/linuxunix/0,39020390,39171012,00.htm
 972. http://www.bayarea.com/mld/mercurynews/7285339.htm
 973. http://www.bday.co.za/bday/content/direct/1,3523,1266306-6099-0,00.html
 974. http://allafrica.com/stories/200301160540.html
 975. http://www.tectonic.co.za/default.php?action=view&id=77
 976. http://www.tectonic.co.za/default.php?action=view&id=139
 977. http://news.com.com/2100-1012-996210.html
 978. http://news.bbc.co.uk/2/hi/technology/3090918.stm
 979. http://www.informationweek.com/story/showArticle.jhtml?articleID=17100349
 980. http://news.com.com/Indian+president+calls+for+open+source+in+defense/2100-7344_3-5259836.html
 981. http://64.233.179.104/search?q=cache:www.icta.lk/Insidepages/News%26event/040805whatsnew.asp
 982. http://www.members.optushome.com.au/brendanscott/papers/oslfossitdeficit040728.pdf
 983. http://www.sourceit.gov.au/__data/assets/pdf_file/42065/A_Guide_to_Open_Source_Software.pdf
 984. http://census.waughpartners.com.au/
 985. http://www.australianit.news.com.au/story/0,24897,23501577-15306,00.html
 986. http://cluecan.ca/
 987. http://www.vialibre.org.ar/lessdeveloped.html
 988. http://danny.oz.au/free-software/advocacy/appropriate.html
 989. http://www.maailma.kaapeli.fi/FLOSS_for_dev.html
 990. http://www.maailma.kaapeli.fi/FLOSS_for_dev.html
 991. http://newsforge.com/newsforge/02/10/24/1739254.shtml?tid=19
 992. http://www.washingtonpost.com/wp-dyn/articles/A10655-2002Oct24.html
 993. http://www.informationweek.com/story/IWK20021024S0001
 994. https://dwheeler.com/oss_fs_why.html#mitre-business-case
 995. http://www.softwarechoice.org/
 996. http://www.cio.com/archive/091504/microsoft.html/
 997. http://www.sincerechoice.org/
 998. http://www.theregister.co.uk/content/4/26616.html
 999. http://www.linuxjournal.com/article.php?sid=7872
1000. http://www.businessweek.com/magazine/content/04_45/b3907083_mz054.htm
1001. http://www.bloomberg.com/fgcgi.cgi?T=marketsquote99_relnews.ht&s=APiS8NBWeTWljcm9z
1002. http://www.fsf.org/
1003. http://www.opensource.org/
1004. http://www.linux.org/info/advocacy.html
1005. http://echo.gmu.edu/freeandopen/
1006. http://opensource.mit.edu/
1007. http://www.kirch.net/unix-nt
1008. http://web.archive.org/web/20010801155417/www.unix-vs-nt.org/kirch/
1009. http://www.catb.org/~esr/writings/cathedral-bazaar/
1010. http://www.csaszar.org/interesting/the_open_source_reader
1011. http://www.wayner.org/books/ffa
1012. http://www.osopinion.com/Opinions/GaneshCPrasad/GaneshCPrasad2.html
1013. http://www.kegel.com/linux/edu
1014. http://www.yoderdev.com/oss-future.html
1015. http://dwheeler.com/oss_fs_refs.html
1016. http://www.mitre.org/work/tech_papers/tech_papers_01/kenwood_software/index.html
1017. http://www.washingtonpost.com/wp-dyn/articles/A60050-2002May22.html
1018. http://www.egovos.org/rawmedia_repository/588347ad_c97c_48b9_a63d_821cb0e8422d?/document.pdf
1019. http://www.egovos.org/Resources
1020. http://www.infonomics.nl/FLOSS
1021. http://www.csc.com/features/2004/uploads/LEF_OPENSOURCE.pdf
1022. https://dwheeler.com/oss_fs_why.html#microsoft-sells-gpl
1023. http://www.linux.com/feature/149403
1024. http://news.com.com/2010-1078-855155.html
1025. http://www.osopinion.com/perl/story/?id=9748
1026. http://perens.com/Articles/StandTogether.html
1027. http://www.aaxnet.com/editor/edit029.html
1028. http://www.opensource.org/halloween
1029. http://opensource.org/halloween/halloween7.php
1030. http://remus.softimage.net/hotmail.html
1031. http://www.theregister.co.uk/content/4/28226.html
1032. http://lwn.net/1999/features/MSResponse.phtml
1033. http://dolinux.dyn.dhs.org/dolinux/docs/response.html
1034. http://fud-counter.nl.linux.org/
1035. http://web.archive.org/web/20020125002126/http://www.shared-source.com/
1036. http://www.gnu.org/philosophy/gpl-american-way.html
1037. http://perens.com/Articles/StandTogether.html
1038. http://www.geocities.com/cloweth
1039. http://news.com.com/2100-1001-870805.html
1040. http://news.com.com/2100-1001-872266.html
1041. http://news.com.com/2100-1001-874132.html
1042. http://www.theregister.co.uk/content/53/24714.html
1043. http://www.theregister.co.uk/content/53/24681.html
1044. http://lwn.net/2002/0411/letters.php3
1045. http://www.cbronline.com/article_news.asp?guid=47FA5398-FA1C-4CC2-B76E-9D4C3E1044C2
1046. http://www.salon.com/tech/fsp/2000/09/12/chapter_7_part_one/index.html
1047. http://www.ibm.com/annualreport/2000/flat/toc/2_3_1_intro.html
1048. http://news.com.com/2100-1001-825723.html
1049. http://news.com.com/2100-1001-822771.html
1050. http://techupdate.zdnet.com/techupdate/stories/main/0,14179,2860394,00.html
1051. http://www.consultingtimes.com/articles/ibm/frye/fryeinterview.html
1052. http://srom.zgp.org/
1053. http://wwwai.wu-wien.ac.at/~koch/forschung/sw-eng/oss_list.html
1054. http://greg.abstrakt.ch/docs/OSP_framework.pdf
1055. http://www.ibiblio.org/osrt/develpro.html
1056. http://www.psychologie.uni-kiel.de/linux-study
1057. http://widi.berlios.de/stats.php3
1058. http://www.infonomics.nl/FLOSS
1059. http://www.research.avayalabs.com/techreport/ALR-2002-003-paper.pdf
1060. http://web.archive.org/web/20040610231701/www.osdn.com/bcg/
1061. http://news.zdnet.co.uk/software/linuxunix/0,39020390,39186360,00.htm
1062. http://firstmonday.org/issues/issue9_1/bonaccorsi
1063. http://www.tldp.org/HOWTO/Software-Proj-Mgmt-HOWTO/index.html
1064. http://www.tldp.org/HOWTO/Software-Release-Practice-HOWTO/index.html
1065. http://www.catb.org/~esr/writings/cathedral-bazaar/
1066. http://www.gartnerweb.com/public/static/hotc/hc00091281.html
1067. http://gnet.dhs.org/stories/bloor.php3
1068. https://dwheeler.com/oss_fs_refs.html
1069. https://dwheeler.com/oss_fs_why.html#market_share
1070. https://dwheeler.com/oss_fs_why.html#reliability
1071. https://dwheeler.com/oss_fs_why.html#performance
1072. https://dwheeler.com/oss_fs_why.html#scaleability
1073. https://dwheeler.com/oss_fs_why.html#security
1074. https://dwheeler.com/oss_fs_why.html#tco
1075. https://dwheeler.com/oss_fs_why.html#non_quantitative
1076. http://www.computerworld.com/softwaretopics/os/linux/story/0,10801,80194,00.html
1077. http://dwheeler.com/oss_fs_eval.html
1078. http://dwheeler.com/gram.html
1079. http://www.openoffice.org/
1080. http://www.spreadfirefox.com/?q=affiliates&id=31988&t=60
1081. http://www.mozilla.org/products/thunderbird/
1082. http://theopendisc.com/
1083. http://www.theopencd.org/
1084. http://www.gnoppix.org/
1085. http://www.knopper.net/knoppix/index-en.html
1086. http://fedora.redhat.com/
1087. http://www.redhat.com/software/rhel/
1088. http://www.novell.com/linux/suse/
1089. http://www.mandrakesoft.com/
1090. http://www.ubuntulinux.org/
1091. http://www.linspire.com/
1092. http://www.debian.org/
1093. https://dwheeler.com/oss_fs_why.html#definitions
1094. https://dwheeler.com/oss_fs_why.html#motivation
1095. https://dwheeler.com/oss_fs_why.html#history
1096. https://dwheeler.com/oss_fs_why.html#licenses
1097. https://dwheeler.com/oss_fs_why.html#management
1098. https://dwheeler.com/oss_fs_why.html#forking
1099. http://www.gnu.org/philosophy/free-sw.html
1100. http://www.fsfla.org/?q=en/node/139#1
1101. http://www.opensource.org/docs/definition_plain.html
1102. http://www.cpi.seas.gwu.edu/oss/cpi_rebuttal.pdf
1103. http://www.debian.org/social_contract.html
1104. http://people.debian.org/~bap/dfsg-faq.html
1105. http://people.debian.org/~bap/dfsg-faq.html
1106. http://dwheeler.com/essays/gpl-compatible.html
1107. http://www.oreilly.com/catalog/opensources/book/perens.html
1108. http://danesecooper.blogs.com/divablog/2005/03/on_license_prol.html
1109. http://egofood.blogspot.com/2005/04/its-official-im-on-osi-board.html
1110. http://www.computerworld.com/governmenttopics/government/legalissues/story/0,10801,95091,00.html
1111. http://www.imc.org/ietf-mxcomp/mail-archive/msg03514.html
1112. http://www.gnu.org/philosophy/free-software-for-freedom.html
1113. http://www.infonomics.nl/FLOSS
1114. http://web.archive.org/web/20040610231701/www.osdn.com/bcg/
1115. http://www.infonomics.nl/FLOSS
1116. http://www.joelonsoftware.com/articles/StrategyLetterV.html
1117. http://www.catb.org/~esr/writings/magic-cauldron/
1118. http://www.infonomics.nl/FLOSS
1119. http://www.multicians.org/fjcc1.html
1120. http://www.catb.org/~esr/writings/cathedral-bazaar/
1121. http://www.oreilly.com/catalog/opensources/book/toc.html
1122. http://www.opensource.org/licenses/
1123. http://www.groklaw.net/article.php?story=20031231092027900
1124. http://www.oreilly.com/catalog/opensources/book/perens.html
1125. http://dwheeler.com/sloc
1126. http://dwheeler.com/essays/gpl-compatible.html
1127. http://www.oreilly.com/catalog/osfreesoft/book/
1128. http://www.catb.org/~esr/writings/cathedral-bazaar/
1129. http://www.businessweek.com/magazine/toc/05_02/B39150502manager.htm
1130. http://httpd.apache.org/dev/guidelines.html
1131. http://www.perl.com/doc/manual/html/Porting/pumpkin.html
1132. http://www.groklaw.net/article.php?story=20041107180408325
1133. http://www.uclibc.org/FAQ.html
1134. http://web.archive.org/web/20040411191201/people.redhat.com/~sopwith/old/glibc-vs-libc5.html
1135. http://dwheeler.com/essays/gpl-compatible.html
1136. http://www.xfree86.org/pipermail/forum/2004-February/003945.html
1137. http://www.xfree86.org/pipermail/forum/2004-February/003974.html
1138. http://linuxtoday.com/developer/2004021803026NWDTLL
1139. http://freedesktop.org/pipermail/x-packagers/2004-February/000001.html
1140. http://lwn.net/Articles/79302/
1141. http://dwheeler.com/essays/gpl-compatible.html#xfree86
1142. http://lwn.net/Articles/420774/
1143. http://dwheeler.com/contactme.html
1144. http://dwheeler.com/secure-programs
1145. http://dwheeler.com/sloc
1146. http://dwheeler.com/oss_fs_eval.html
1147. http://dwheeler.com/essays/scm.html
1148. http://dwheeler.com/essays/gpl-compatible.html
1149. http://dwheeler.com/oss_fs_refs.html
1150. http://dwheeler.com/essays/securing-windows.html
1151. http://dwheeler.com/essays/scm-security.html
1152. http://dwheeler.com/essays/spam-email-password.html
1153. http://dwheeler.com/innovation
1154. http://dwheeler.com/essays/stopspam.html
1155. http://dwheeler.com/essays/Fischer_Random_Chess.html
1156. http://dwheeler.com/flawfinder/
1157. http://dwheeler.com/sloccount/
1158. http://dwheeler.com/
1159. http://dwheeler.com/contactme.html
1160. http://validator.w3.org/check?uri=referer
1161. http://dwheeler.com/contactme.html

