Response to Microsoft's "Linux Myths"
copyright 1999, Rex Ballard
In article <5EzK3.5071$Pf4.27567@news.rdc2.mi.home.com>, "Drestin Black" wrote:
> Do I agree with everything written? No. All companies engage in marketing,
> and marketing sometimes exaggerates. But I believe there is some substance
> herein.
>
> http://www.microsoft.com/ntserver/nts/news/msnw/LinuxMyths.asp
This is a very interesting article, and its nature is even more interesting. In the context of the discussions in these Usenet groups, it is ironic to the point of being hilarious.
It is a bit hard to follow. Several of the links have "Netscape Killer" JavaScript that locks up Netscape on Linux, NT, or 95. This is actually my third attempt to post this article. The last time, I tried it on NT and got a B.S.O.D. I strongly suggest that you have your boss get on NT, start Netscape, and go to the reference on "37% lower TCO".
Myth: Linux performs better than Windows NT
UnReality: Windows NT 4.0 Outperforms Linux On Common Customer Workloads
Consider the referenced article:
http://www.zdnet.com/products/stories/reviews/0,4161,1015266,00.html
First, the article was written in June 1999, prior to the release of a kernel that can distribute interrupt handling for multiple NIC cards across processors. The tested configuration used dual 100Base-T cards.
A rebuttal posting (search the "Mindcraft Benchmarks" thread for the URL) identified the problem. Essentially, the summary was: "If you want to serve cached static pages from dual NIC cards, you should use Netscape/IIS; for more typical real-world applications, Apache should do quite well." Microsoft did close the gap - typically Linux was only 20-30% faster in the "real-world" configurations.
Apparently, both systems were configured with defaults, which meant that Apache was doing a reverse DNS lookup on every request while IIS was logging only dotted-decimal addresses.
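If you want to see why that default matters, here is a rough sketch in Python (the address and the timing are placeholders, not benchmark data): logging the raw dotted-decimal address costs essentially nothing, while a reverse lookup costs a round trip to a DNS server on every request.

# Rough illustration of why per-request reverse DNS hurts a benchmark.
# The address below is a documentation address; substitute a real client IP.
import socket
import time

def log_raw(addr):
    return addr  # log the dotted-decimal address as-is (the IIS default)

def log_resolved(addr):
    try:
        return socket.gethostbyaddr(addr)[0]  # hostname lookup (the Apache default)
    except socket.herror:
        return addr

if __name__ == "__main__":
    addr = "192.0.2.10"
    t0 = time.time()
    log_raw(addr)
    t1 = time.time()
    log_resolved(addr)
    t2 = time.time()
    print(f"raw logging:      {(t1 - t0) * 1000:.3f} ms")
    print(f"reverse DNS path: {(t2 - t1) * 1000:.3f} ms")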
Finally, the version evaluated did not have the khttpd "kernel hook", while IIS runs much of its code in kernel mode. I haven't seen the comparative benchmarks, but it's interesting to see Microsoft trotting out these already discredited benchmarks against obsolete software as its case for NT superiority.
In fact, there have been two "Major" releases of Linux since the version on which those benchmarks were run. I notice Microsoft isn't bragging about recent benchmarks. This may not be fair, but it does say something about the responsiveness of the Linux community.
A similar set of benchmarks was run to break down the performance testing of the SMB server. In this case, it is true that NT is slightly faster when run in a "custom tweaked" configuration where the journal, files, and swap space are kept on separate spindles while Linux runs swap and e2fs on the same spindles.
In addition, it was noted that the benchmark stressed small directories containing very large files, which exploit the larger clusters of NT and play against the i-node structure of e2fs. Linux excels in filesystems containing directories filled with hundreds of smaller files, such as HTML text, usenet news articles, and email broken into one file per message (mh-style). Since this means that to handle large numbers of smaller objects one MUST rely on a relational database to get better performance out of NT, the perceived performance gain may be less than claimed.
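The cluster-size argument is just arithmetic. Here's a back-of-envelope sketch in Python; the file sizes and cluster sizes are made-up assumptions, purely to show the shape of the trade-off:

# With large allocation units, a directory full of small files wastes most
# of its space, while a handful of big files amortizes the overhead.

def allocated(size_bytes, cluster_bytes):
    """Space consumed when a file is stored in fixed-size clusters."""
    clusters = max(1, -(-size_bytes // cluster_bytes))  # ceiling division
    return clusters * cluster_bytes

def overhead(file_sizes, cluster_bytes):
    used = sum(file_sizes)
    alloc = sum(allocated(s, cluster_bytes) for s in file_sizes)
    return alloc, used, 100.0 * (alloc - used) / alloc

small_files = [2_000] * 10_000      # e.g. 10,000 short HTML/news/mail files
large_files = [50_000_000] * 4      # a handful of big benchmark files

for label, sizes in (("small files", small_files), ("large files", large_files)):
    for cluster in (1_024, 4_096, 32_768):
        alloc, used, pct = overhead(sizes, cluster)
        print(f"{label}, {cluster // 1024}K clusters: "
              f"{used} bytes of data in {alloc} bytes allocated ({pct:.0f}% wasted)")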
But Microsoft trots out this stuff as authoritative proof that NT will outperform Linux in every imaginable situation.
A detailed breakdown of test results indicates the type of tuning that would be required to get the best performance out of each system, but it also dramatically illustrates how heavily these benchmarks are tuned to favor NT strengths and minimize NT shortcomings.
> Myth: Linux is more reliable than Windows NT
> Reality: Linux Needs Real World Proof Points Rather than Anecdotal Stories
This is one that IT managers are starting to sort out for themselves. They see real NT budgets spiraling at an exponential rate as the number of servers is increased to add functionality, which in turn increases the number of CALs.
Many IT managers are keeping a much closer eye on the availability of NT and UNIX systems. Microsoft "guarantees" 99.9% availability. In other words, if your system is down more than 10 minutes/WEEK, 45 minutes/MONTH, or 120 minutes/QUARTER - excluding scheduled maintenance - they will --
- refund the price of the NT server license?
- provided you remove NT from the processor?
Of course, with Edsel Murphy on the NT development team, it appears that NT could spend the entire 120 minutes blowing away the best two hours of the quarter. You're losing $2 million a minute to a blown server, and Microsoft will generously refund $1000 of the $1/4 billion you just lost?
Actually, if you read the "fine print", they only promise to conduct an investigation to determine what factors, outside the system, caused the failures.
Would you like to buy some land in Colorado? About 200 miles due east of Pueblo? (hint - bring LOTS of water and at least 50 gallons of extra gas for your 4x4).
The UNIX vendors are competing toe-to-toe with Linux and are now touting availability of 99.999%, or about 5 minutes/YEAR. One UNIX vendor actually touted downtime in parts per million - 1 ppm is about 30 seconds/year.
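The downtime figures above are just percentages turned into minutes. A quick Python sketch of the arithmetic (no vendor data involved, just the calendar):

# Convert an availability percentage into allowed downtime per period,
# plus the parts-per-million figure quoted above.

MINUTES = {"week": 7 * 24 * 60, "month": 30 * 24 * 60,
           "quarter": 91 * 24 * 60, "year": 365 * 24 * 60}

def allowed_downtime(availability_pct):
    down_fraction = 1.0 - availability_pct / 100.0
    return {period: down_fraction * minutes for period, minutes in MINUTES.items()}

for pct in (99.9, 99.999):
    downtime = allowed_downtime(pct)
    print(f"{pct}% available:",
          ", ".join(f"{mins:.1f} min/{period}" for period, mins in downtime.items()))

# 1 ppm of downtime over a year works out to roughly 30 seconds:
print(f"1 ppm downtime = {1e-6 * MINUTES['year'] * 60:.0f} seconds/year")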
Actually, it's such a greased slide from Linux to AIX or Solaris, that most IT managers use Linux for development, pilots, and test markets, and then switch to AIX or Solaris for "Full Blown Production". If you use NT for the small-scale prototyping, where do you go when you need "Big Iron"?
Microsoft also inaccurately asserts that there are no clustering technologies for Linux by using the carefully worded phrase "commercially proven clustering technologies", which is a bit like Novell's clever control of the definition of "Network Operating System" to exclude UNIX, TCP/IP, and NFS or SMB. Even when the Internet had eclipsed Novell's market with 10 million TCP/IP capable machines in 1994, Novell claimed to have 80% of the "Network Operating System" market. Of course, the price of Novell's stock dropped 70% that same month.
Microsoft also touts the lack of features such as a journalling file system as a flaw in Linux. Microsoft NEEDS a JFS because NT fails more often, and in catastrophic ways. Linux does have reliability features of its own, and there are JFS filesystems available as modules. But a JFS carries a huge performance penalty, would be redundant for systems such as Sybase, DB/2, or ORACLE databases, and is less necessary for flat files, especially when they can be journalled using revision control systems. NT stores all objects in binary formats, and most applications don't have structured recovery systems. If I have a modified file open in vi and the system gets rebooted, I can run vi -r. Most UNIX/Linux applications have recovery procedures built into them.
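To make the performance-penalty point concrete, here is a toy write-ahead journal for a flat file in Python. This is NOT how ext2, NTFS, or any real JFS is built; it is only a sketch of the trade-off: every journalled update is written (and synced) twice, which buys crash recovery at the cost of extra I/O.

import os

JOURNAL = "data.journal"
DATAFILE = "data.txt"

def journalled_append(record: str):
    # 1. Write the intent to the journal and force it to disk.
    with open(JOURNAL, "a") as j:
        j.write(record + "\n")
        j.flush()
        os.fsync(j.fileno())
    # 2. Apply the same record to the data file.
    with open(DATAFILE, "a") as d:
        d.write(record + "\n")
        d.flush()
        os.fsync(d.fileno())
    # 3. Only now is the journal safe to discard (truncated here for brevity).
    open(JOURNAL, "w").close()

def recover():
    """After a crash, replay anything left in the journal into the data file.
    A real journal tags entries so replay is idempotent; omitted in this toy."""
    if os.path.exists(JOURNAL) and os.path.getsize(JOURNAL) > 0:
        with open(JOURNAL) as j, open(DATAFILE, "a") as d:
            d.write(j.read())
        open(JOURNAL, "w").close()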
> Myth: Linux is Free
> Reality: Free Operating System Does Not Mean Low Total Cost of Ownership
Microsoft is half right. Linux is not free. The profit in selling Linux comes from selling service contracts. If all you need is a little telephone support getting it installed, it's only $30-80. If you want 24/7 full hand-holding for every member of your staff, LinuxCare and IBM Global Services have contracts available - but they are a bit more expensive.
Of course, Microsoft also cites a 1997 article (don't follow this link in Netscape with JavaScript enabled; it has the "killer code" as of midnight 10/8/1999) which "proves" that NT 4.0 is 37% cheaper than the equivalent functionality provided by a combination of UNIX and NetWare.
Given that NetWare 4.0 had just come out and wasn't doing very well, and you needed two servers, two different skill sets, and two different protocol stacks in the clients, it's not surprising that NT might have been cheaper for 50-100 users.
Of course, this same article then extrapolates to assume that you will only need 4 NT 4.0 servers (prior to SP3) to support 1000-3000 users. Can you imagine what that little claim will do to Microsoft's credibility with IT managers? The current industry average is 24 servers per thousand users and 4 FTEs per thousand just for SERVER maintenance - and that's for 12/6 availability at 99.9%.
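A quick sketch of what those two sets of numbers imply, in Python. The per-1,000-user figures come from the paragraph above; the salary figure is an assumption of mine, thrown in only to show the order of magnitude:

def support_estimate(users, servers_per_1000, ftes_per_1000, fte_cost=80_000):
    servers = users / 1000 * servers_per_1000
    ftes = users / 1000 * ftes_per_1000
    return servers, ftes, ftes * fte_cost

scenarios = [
    # (label, servers per 1,000 users, admin FTEs per 1,000 users)
    ("Microsoft's extrapolation", 4, 0),   # 4 servers for 1,000-3,000 users; staffing unstated
    ("industry average cited",   24, 4),
]

for label, servers_rate, ftes_rate in scenarios:
    servers, ftes, cost = support_estimate(1000, servers_rate, ftes_rate)
    print(f"{label}: {servers:.0f} servers, {ftes:.0f} admins, ~${cost:,.0f}/yr in labor")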
The quote sheets from most ISP hosting services give a much different story. Typically, the "starter system" for Linux is cheaper, and the cost/performance ratio improves as you increase capacity. Some ISPs are now quoting "dry servers", which means the quoted support prices for NT do not include server licenses or CALs.
> Myth: Linux is more secure than Windows NT
> Reality: Linux Security Model Is Weak
Another half-truth. Microsoft assumes that Linux users would use SMB for direct access to a secured system. Given that Linux can serve as its own firewall, each TCP connection can be kerberos-protected, and each port, process, and application can be managed using symbolic links, chroot, and other access controls. In fact, you would have a real hard time getting an ISP to give you "telnet access" to an NT server.
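Here's a minimal sketch of the kind of per-service confinement I'm talking about, in Python: put a service in a chroot jail and drop root before serving. The jail path and UIDs are assumptions; a real deployment would also close inherited file descriptors, set supplementary groups, and so on. It has to be started as root to work at all.

import os

def confine(jail_dir="/var/jail/www", uid=65534, gid=65534):
    os.chdir(jail_dir)        # move into the jail first
    os.chroot(jail_dir)       # filesystem root is now the jail
    os.setgid(gid)            # drop group privileges (e.g. nobody/nogroup)
    os.setuid(uid)            # drop user privileges last; root is gone after this

def serve():
    # ... application code runs here with no view of the real filesystem
    # and no root privileges ...
    pass

if __name__ == "__main__":
    confine()
    serve()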
The reality is that Linux comes with a set of defaults that is superior to Windows 95, and superior to NT servers with Windows 95 workstations attached. Security measures are built into the servers. Most servers come with pluggable authentication modules that can be set up for certificates, kerberos, or even security systems that are illegal to export outside the United States.
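If "pluggable authentication" sounds abstract, here is a conceptual sketch in Python of the stacking idea. This is NOT the real PAM API; the module names and rules are made up purely to show how independent checks get chained:

from typing import Callable, List, Tuple

AuthModule = Callable[[str, str], bool]

def check_password_file(user: str, secret: str) -> bool:
    # placeholder: in reality this would consult /etc/shadow, a Kerberos KDC,
    # a certificate, or whatever module the administrator plugged in
    return (user, secret) == ("alice", "s3cret")

def check_not_locked_out(user: str, secret: str) -> bool:
    return user not in {"mallory"}          # placeholder lockout list

def authenticate(user: str, secret: str,
                 stack: List[Tuple[str, AuthModule]]) -> bool:
    """Every module in the stack must pass, in order."""
    return all(module(user, secret) for _name, module in stack)

STACK = [("unix", check_password_file), ("lockout", check_not_locked_out)]
print(authenticate("alice", "s3cret", STACK))   # True
print(authenticate("mallory", "x", STACK))      # False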
When NIS is enabled, netgroups, which are similar to NT access control lists, are available. CORBA objects can also be security-enabled.
What Linux really lacks is pseudo-security: the "fluffy" security that makes managers think no one can read their mail, while an ActiveX control installed on a server can allow a disgruntled employee to read that mail (and send some) undetected.
> Myth: Linux can replace Windows on the desktop
> Reality: Linux Makes No Sense at the Desktop
Another set of half-truths. The Linux community was primarily focused on Linux as a server until July 18, 1998. In less than one year, the picture has changed radically. Today, there are entire suites of applications aimed at desktop users that didn't even exist a year earlier.
Many of these packages are even easier to use than their Windows equivalents. In some cases, the same Qt toolkit was used to develop the Windows versions.
> Summary
> The Linux operating system is not suitable for mainstream usage by business
> or home users.
> Today with Windows NT 4.0, customers can be confident in delivering
> applications that are scalable, secure, and reliable--yet cost effective to
> deploy and manage.

I can think of a few former NT Server project managers who might have less enthusiasm for that statement. Most of the survivors have moved their scalable products to UNIX.
Several Fortune 500 companies and government agencies have disabled ActiveX in their browsers and have blocked access to the VeriSign servers for ActiveX signatures.
After Melissa, the "*Zip", and some of the nastier viruses that have wiped out entire e-mail systems and client file-systems, touting NT as secure is really a stretch.
A more accurate statement is "NT 4.0 is the most scalable, secure, reliable, and cost-effective system MICROSOFT has ever produced". This relieves it of the burden of those dubious comparisons to UNIX and Linux.
> Linux clearly has a long way to go to be competitive with Windows NT 4.0.
Which is why Linux has been deployed at a slightly faster rate than NT 4.0. Linux had 17% of the server market based on units sold in 1997. What was not mentioned in the IDC report was that operators were making multiple installations of Linux for each unit purchased; the industry average seems to be about 5 copies per unit sold. A more recent survey indicates that the Linux share grew to 34% of the market while NT shrank to 24%.
These days, it's harder to sort the servers from the workstations. Linux appears to have about 50 million users, based on unit sales and the cloning. Cheap CD-ROM burners have created a Linux epidemic in the public schools. Linux installation via corporate LANs is also getting quite popular.
The best indicators are secondary sales, such as MS-DOS, PC-DOS/2000, Partition Magic, System Commander, and applications such as WordPerfect (1 million downloads/month).
Of course, it's also getting to the point where a supported copy of Linux for $28, packaged with BootMagic and Partition Magic, should be doing some really interesting things to the numbers.
> With the release of the Windows 2000 operating system, Microsoft extends the
> technical superiority of the platform even further ensuring that customers
> can deliver the next generation applications to solve their business
> challenges.
In other words "Windows 2000 will be better than UNIX"!
Haven't we heard that one before?
- In 1991 when Bill Gates first announced 3.0.
- In 1992 when Bill announced NT.
- In 1993 when Bill announced Chicago.
- In 1994 when Bill announced Windows 95.
- In 1995 when Bill announced NT 4.0.
- In 1996 when Bill announced Windows 97 - oops 98.
Maybe it's true. Maybe Windows 2000 will be better than SunOS 4.0 (the version of UNIX running on Sun systems back in 1990). But from the previews of Windows 2000 that I have seen thus far, Windows 2000 doesn't even come close to being as good as Linux or Solaris.
Meanwhile, hundreds of new application packages are coming out for Linux, and each release gets friendlier and friendlier. To top that, Linux-empowered computers (LCs) will be showing up on retail shelves later this year. Who knows, they may even put them next to the iMacs.
In addition, all the Linux distributions are now taking the pain out of repartitioning hard drives. In fact, Linux can even read your FAT32 partitions. Can NT? Can NT 2000?
And while Red Hat released 6.1 and everybody else readies their follow-ups, Microsoft has a few patches you need to apply to get Office 97, Office 95, and Windows 95 to be Y2K compliant. The only problem is that the patches might cause problems for some of your 3rd-party applications. Bummer.
Rex Ballard - Open Source Advocate, Internet I/T Architect, MIS Director
http://www.open4success.com
Linux - 50 million and growing at 3%/week!