Free/Open Source Software has been found to be more reliable than proprietary software of the same type
There is a general notion that OSS/FS is more reliable than its proprietary counterparts. Here we examine the evidence for that notion.
- GNU/Linux is more reliable than Windows NT, according to a one-year Bloor Research experiment. Bloor Research ran both operating systems on relatively old Pentium machines. During the one-year test, GNU/Linux crashed once due to a hardware fault (disk problems), which took 4 hours to fix, giving it a measured availability of 99.95 percent. Windows NT crashed 68 times, caused by hardware problems (disk), memory (26 times), file management (8 times), and various odd problems (33 times). These failures took 65 hours to fix in total, giving an availability of 99.26 percent. It is intriguing that the only GNU/Linux problem and many of the Windows problems were hardware-related; it could be argued that the Windows hardware was worse, or that GNU/Linux did a better job of avoiding and containing hardware failures. The file management failures and the odd problems, however, appear to be due to Windows itself, indicating that GNU/Linux is far more reliable than Windows. GNet summarized the result by saying “the winner here is clearly Linux.”
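The availability percentages in the Bloor comparison follow directly from the reported hours of downtime; a quick sketch (assuming a non-leap year of 8760 hours):

```python
# Availability = fraction of the year the system was up.
HOURS_PER_YEAR = 365 * 24  # 8760

def availability(downtime_hours: float) -> float:
    """Percent of the year the system was available."""
    return 100.0 * (HOURS_PER_YEAR - downtime_hours) / HOURS_PER_YEAR

print(f"GNU/Linux:  {availability(4):.2f}%")   # 4 hours down  -> 99.95%
print(f"Windows NT: {availability(65):.2f}%")  # 65 hours down -> 99.26%
```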
- German import company Heinz Tröber found Linux-based desktops to be far more reliable than Windows desktops: Windows had a 15% daily failure rate, while Linux had 0%. Günter Stoverock, the company’s data processing manager, reported that they had decided to run their ERP software on Linux-based systems instead of Windows because Windows was much less reliable. Stoverock stated that on Windows, “Out of 65 desktops, around 10 desktops crashed daily... Employees wasted around 30 minutes, that’s five times 30 minutes per week.” Note that this is a 15% daily failure rate, and the actual impacts were almost certainly more severe than simply 30 minutes of lost time per crash. After all, this generous calculation ignores the cost of lost data (requiring re-entry), the time to restart whatever task was interrupted, and the time for people to regain their focus on what they were doing. Stoverock then stated, “That’s not acceptable - we had to do something [to solve this].” The company switched to Linux desktop systems in 2001, and had no downtime at all afterwards (through March 2005). He reported that “There are no problems - in the morning you turn the computer on, in the afternoon you turn it off - that’s it.”
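The failure-rate arithmetic behind Stoverock's figures can be checked in a couple of lines; the numbers are taken straight from the quote, and the per-week figure is the report's own five-workday estimate:

```python
desktops = 65
daily_crashes = 10        # "around 10 desktops crashed daily"
minutes_per_crash = 30    # "Employees wasted around 30 minutes"

daily_failure_rate = daily_crashes / desktops  # 10/65 ~= 0.154
weekly_minutes = 5 * minutes_per_crash         # "five times 30 minutes per week"

print(f"Daily failure rate: {daily_failure_rate:.1%}")          # 15.4%
print(f"Per employee, per week: {weekly_minutes} minutes lost") # 150 minutes
```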
- Equivalent OSS/FS applications are more reliable, according to the Fuzz study. The paper “Fuzz Revisited” measured reliability by feeding programs random characters and determining which ones resisted crashing and freezing. The report states in section 2.3.1: “It is also interesting to compare results of testing the commercial systems to the results from testing ‘freeware’ GNU and Linux. The seven commercial systems in the 1995 study have an average failure rate of 23%, while Linux has a failure rate of 9% and the GNU utilities have a failure rate of only 6%. It is reasonable to ask why a globally scattered group of programmers, with no formal testing support or software engineering standards, can produce code that is more reliable (at least, by our measure) than commercially produced code. Even if you consider only the utilities that were available from GNU or Linux, the failure rates for these two systems are better than the other systems.” In short, OSS/FS had higher reliability by this measure. There is also evidence that Windows applications are even less reliable than the proprietary Unix software (and thus less reliable than the OSS/FS software). A later paper, published in 2000, “An Empirical Study of the Robustness of Windows NT Applications Using Random Testing”, found that its authors could crash 21% of the Windows NT GUI applications they tested, hang an additional 24%, and crash or hang all of the tested applications by subjecting them to random Win32 messages. Thus, there is no evidence that proprietary Windows software is more reliable than OSS/FS by this measure.
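The fuzz technique itself is easy to reproduce in miniature. The sketch below (a hypothetical helper, not the actual Fuzz tools) feeds a burst of random bytes to a program's standard input and classifies the outcome roughly the way the study did: a program fails if it dies on a signal or hangs, while exiting cleanly - even with an error status - counts as surviving:

```python
import random
import subprocess

def fuzz_once(command, length=4096, timeout=5.0):
    """Feed random bytes to a program's stdin; return "ok", "crash", or "hang".

    Following the Fuzz studies, exiting with an error status still counts
    as surviving ("ok"): the program handled the bad input gracefully.
    """
    noise = bytes(random.randrange(256) for _ in range(length))
    try:
        result = subprocess.run(command, input=noise,
                                capture_output=True, timeout=timeout)
    except subprocess.TimeoutExpired:
        return "hang"     # froze: no exit within the timeout
    if result.returncode < 0:
        return "crash"    # negative return code: killed by a signal
    return "ok"

# e.g. fuzz_once(["cat"]) should report "ok": cat copies any bytes through.
```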
- IBM studies found GNU/Linux highly reliable. IBM ran a series of extremely stressful tests for 30 and 60 days, and found that the Linux kernel and other core OS components - including libraries, device drivers, file systems, networking, IPC, and memory management - operated consistently and completed all the expected durations of runs with zero critical system failures. Linux system performance was not degraded during the long duration of the run, the Linux kernel properly scaled to use hardware resources (CPU, memory, disk) on SMP systems, the Linux system handled continuous full CPU load (over 99%) and high memory stress well, and the Linux system handled overloaded circumstances correctly. IBM declared that these tests demonstrate that “the Linux kernel and other core OS components are reliable and stable ... and can provide a robust, enterprise-level environment for customers over long periods of time.”
- GNU/Linux is more reliable than Windows NT, according to a 10-month ZDnet experiment. ZDnet ran a 10-month reliability test comparing Caldera Systems OpenLinux, Red Hat Linux, and Microsoft’s Windows NT Server 4.0 with Service Pack 3. All three ran on identical (single-CPU) hardware, and network requests were sent to each server in parallel for standard Internet, file, and print services. The result: NT crashed an average of once every six weeks, each crash taking about 30 minutes to fix; that’s not bad, but neither GNU/Linux server ever went down. The ZDnet article also does a good job of identifying GNU/Linux weaknesses of the time (e.g., desktop applications and massive SMP). Windows has presumably improved since this study - but the OSS/FS systems have certainly improved as well.
- 80% of the top ten most reliable hosting providers ran OSS/FS, according to Netcraft’s May 2004 survey. The survey of the top ten most reliable hosting providers found 4 running GNU/Linux, 4 running FreeBSD, and only 2 running Microsoft Windows.
References
David, A. (2007). Why Open Source Software / Free Software (OSS/FS, FLOSS, or FOSS)? Look at the Numbers! Retrieved from http://www.dwheeler.com/oss_fs_why.html
Other important sources
Robert, C. & Richard, C. (2004). Free and Open Source Software: Overview and Preliminary Guidelines for the Government of Canada. Retrieved from www.sita.co.za/FOSS/Gov_Canada-OSS_Guide-Dec04.pdf
Allen, G. (2008). Good to Great FOSS: Learnings from Africa. Retrieved from www.aspirationtech.org/files/GoodToGreatFOSS-LearningsFromAfrica.pdf
Kenneth, W. (2004). Free/Open Source Software: Government Policy. Retrieved from http://www.sita.co.za/FOSS/Gov-OSS_Guide-04.pdf