Copyright 1999, Rex Ballard
Business Case for Linux - 1999
In article <7ulijt$9ktnh$1@titan.xtra.co.nz>, "Stuart Fox" wrote:
>
> "R.E.Ballard (Rex Ballard)" wrote in message
> news:7uj850$a1e$1@nnrp1.deja.com...
> > In article <7ucia2$s2n$1@nnrp1.deja.com>,
> > R.E.Ballard (Rex Ballard) wrote:
> > > In article <38064DA4.74F0B79C@home.com>,
> > > Abe Waranowitz wrote:
> > > >
> > > > Our close minded corporate IT Security Group
> > >
> > > First of all, don't make matters worse than they need
> > > to be. Don't characterize them at all, especially as
> > > "Close Minded".
> > >
> > > Fundamentally, they are either uninformed, or misinformed,
> > > and are naturally concerned and fearful of the unknown.
> > > This is natural for people, but is actually the duty of
> > > security officers.
> >
> > I took the liberty of putting some new links and references on
> > http://www.open4success.com/index.html
> >
> > It's a series of pointers to help you build a case for Linux,
> > not only as a good workstation, but also as
> > a "Best Business Practice".
>
> What an enlightening site.
> Can I ask where you got this figure from?
>
> "Of course, this same article then
> extrapolates to assume that you will only
> need 4 NT 4.0 servers (prior to SP3) to
> support 1000-3000 users. Can you
> imagine what that little claim will do to
> Microsoft's credibility with IT
> managers? The current industry average
> is 24 servers/thousand and 4
> FTE's/1000 just for SERVER maintenance -
> this is for 12/6 availability at 99.9%."
>
> 24 servers per thousand. Where did you get this from?
One of my previous clients had over 100,000 workstations. Each user was connected through a local area network to several applications and functions, including Lotus Notes, shared files, databases, web servers, and custom applications.
They decided to aggressively adopt Windows NT based largely on the assumption that they would only need 200-300 servers (based on the article cited). Within a year of adopting this policy, they had over 2,500 servers to service these users (the last report through the grapevine puts it at over 3,500). Divide both numbers by 100 and you get a nice average of 25 servers per 1,000 users.
Initially, the average uptime against a 12/6 expectation was 98.7%. Service packs, system tuning, and isolation of applications raised this to 99.8%. Migrating numerous applications to UNIX raised NT uptime to the 99.9% promised by Microsoft. These figures exclude mandatory "maintenance" such as nightly reboots.
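To put those percentages in perspective: a 12-hour/day, 6-day/week window is 72 service hours per week, so 98.7% uptime means roughly an hour of unplanned downtime each week, while 99.9% means about four minutes. Here is that arithmetic as a small C sketch (the window and percentages are simply the figures cited above, not new data):

    #include <stdio.h>

    int main(void)
    {
        const double window = 12.0 * 6.0;            /* 72 service hours/week */
        const double avail[] = { 98.7, 99.8, 99.9 }; /* figures cited above   */
        int i;

        for (i = 0; i < 3; i++) {
            double down = window * (100.0 - avail[i]) / 100.0;
            printf("%.1f%% uptime -> %.2f hours (%.0f minutes) of "
                   "unplanned downtime per week\n",
                   avail[i], down, down * 60.0);
        }
        return 0;
    }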
Even after the migration, there were roughly 300 UNIX servers, mostly AIX and Solaris machines.
My staffing and other TCO estimates are based on the numbers actually experienced by this corporation. If they wish to identify themselves, they can do so.
Because some of my clients have experienced reprisals from Microsoft, I will not officially associate the clients with the figures. These records are carefully kept and archived.
In the past 5 years, I have consulted with numerous clients, and part of the job is to assess their overall IT situation: what are the strong points, and what are the weak points? Shortly after Windows NT came out, most of my clients wanted help integrating UNIX, MVS, and NT machines, with the intent of migrating as much as possible to NT. Today, most of my clients want the same types of integration, but now because they want to migrate as much as possible to UNIX. Many companies are seriously exploring migrations to Linux, including certain specialized desktop machines.
There is still a reluctance to go for wholesale adoption of Linux, and many are also exploring FreeBSD (since it can run most Linux binaries). But for things like customer-service consoles, remote office servers, small branch offices, workgroup servers, prototyping, and "first-year" test marketing, Linux is becoming very popular.
Generally, the thinking is still to move from Linux to Solaris or AIX once the project has proven to have an ROI sufficient to justify large Enterprise-class UNIX servers.
> > > > has threatened to shut
> > > > down my 'Engineering Workstation' Linux box,
> > > > because 'Linux is not supported and it has security issues.'
> > > > (We're a HUGE communications company...)
> > >
> > > Process of elimination. AT&T has no problem. US West and
> > > Southwestern Bell have no problem. Many of the major companies
> > > regularly use Linux workstations, and at least one company uses
> > > Linux workstations in their mission critical environments.
> >
> > You must work for QWEST. Microsoft paid a huge premium for
> > a controlling interest. If so, the best you can do is make
> > the "hotmail" case. Microsoft DID try to convert Hotmail
> > to NT and failed. The original BSD systems used forks and
> > Unix domain sockets - they tried to convert to threads on Sun,
> > but even then it was obvious that NT 4.0 wasn't going to cut it.
>
> You've got a real bee in your bonnet about forks haven't you?
The fundamental difference between the NT paradigm and the Linux/UNIX paradigm is essentially the fork() call. Linux and UNIX were designed to run thousands of "itty bitty processes" that pass messages between each other using kernel-managed, memory-mapped buffer exchanges that are essentially unstructured. Each buffer is passed intact. In UNIX, applications delegate a great deal of functionality to the kernel and to numerous servers, which are either co-resident or accessible via network links. For example, access to the hard drive is managed and secured through web servers, mail servers, news servers, chat servers, directory servers, and about a hundred other "utility servers".
To improve performance and simplify the programming model, each connection to a server is handed to a forked child process. Each process is completely independent, and acts as if there were only one client connected to the machine. The client, in turn, acts as if there were only one server. It is as if they were communicating with each other through a transparent serial interface.
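To make the model concrete, here is a minimal sketch in C of a fork()-per-connection server. The port number, buffer size, and echo behavior are illustrative choices only, and error handling is abbreviated:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>
    #include <signal.h>
    #include <sys/socket.h>
    #include <netinet/in.h>

    int main(void)
    {
        int listener = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in addr;

        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(8080);      /* illustrative port */

        if (listener < 0 ||
            bind(listener, (struct sockaddr *)&addr, sizeof(addr)) < 0 ||
            listen(listener, 5) < 0) {
            perror("setup");
            exit(1);
        }
        signal(SIGCHLD, SIG_IGN);         /* let the kernel reap children */

        for (;;) {
            int conn = accept(listener, NULL, NULL);
            if (conn < 0)
                continue;
            if (fork() == 0) {
                /* Child: completely independent, sees exactly one
                 * client, as if over a dedicated serial line. */
                char buf[512];
                ssize_t n;
                close(listener);
                while ((n = read(conn, buf, sizeof(buf))) > 0)
                    write(conn, buf, n);  /* pass each buffer back intact */
                close(conn);
                _exit(0);
            }
            close(conn);  /* parent: hand the connection off, keep listening */
        }
    }

The point of the sketch is the structure: after accept(), the child is written as if it were the only program on the machine and its client the only client, while the kernel keeps thousands of such children isolated from one another.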
The Windows NT paradigm is essentially an extension of the original MS-DOS paradigm, which delegates as much functionality as possible to the application, providing only those functions absolutely essential to keep components (originally TSRs and programs, now COM and DCOM objects) from corrupting each other.
The Microsoft paradigm initially seems quite pleasant. Applications can do many things they wouldn't be allowed to do in UNIX systems. They can share memory with the kernel, gain privileged access to devices, and even alter and control scheduling. When properly used, this can make Windows programs very powerful.
Unfortunately, this means that every application must have at least one person with the expertise of a systems programmer, preferably with the ability to write an "application kernel" to prevent an application's own components from interfering with each other. Worse, there is really no one making sure that components shared by multiple applications won't interfere with each other.
Microsoft has finally "seen the light", and will be offering MTS as well as COM+ as options. This should improve reliability substantially. I wouldn't be surprised if we found that NT systems rebooted weekly failed less than once a month. This will radically reduce the TCO of NT systems. It might cut the cost by as much as 70%. Unfortunately, it will still be higher than that of UNIX.
Unfortunately, this also means that applications must be redesigned to fit the new model. Many "ill behaved" MS-DOS applications could not run on Windows; many "ill behaved" Windows 3.0 applications wouldn't run on 3.1; many "ill behaved" Windows 3.1 applications wouldn't run on NT or 95; many "ill behaved" 95 applications wouldn't run on NT 4.0; and many "ill behaved" NT 4.0 applications will not run on Windows 2000. Backward compatibility will continue to be a problem for Microsoft, and this won't be the last round of "ill behaved" applications.
Many software vendors have turned to Linux and UNIX out of frustration. Lotus created its first UNIX port of 1-2-3 when versions developed for Windows 3.0 wouldn't run properly on Windows 3.1. For the UNIX port, Lotus put in many extra features, simply because it took less time to develop the baseline and they could add other features very easily. In fact, many of the features introduced in Excel were designed to provide a similar "look and feel" to Lotus 1-2-3 for UNIX (but not the same functionality).
UNIX and Linux have established a track record of maintaining backward compatibility while adding new capability. Programs written 20 years ago, or even 5 years ago, will still run on the latest versions of Linux. This means that more resources are available for new innovation.
Microsoft spends 2 years telling people that it will be providing features "sorta like" those of UNIX or Linux, and eventually delivers a disappointing product. It has been this way literally since MS-DOS 2.0.
Linux distributors just say "we've got a new one", and try to get as many of the innovations as possible onto the box label (the list is out of date by the time the boxes are printed). Users just load it up and discover the new features for themselves.
Rex Ballard - Open Source Advocate, Internet I/T Architect, MIS Director
http://www.open4success.com
Linux - 52 million and growing at 3%/week!