Open4Success: Miracles, Commercialization of the Internet, Linux, Open Source

Re: Conspiracy - Re: A challenge: A Linux office?

Copyright 1999 - Rex Ballard

Date: 1999/01/05

Author: r.e.ballard

In article <76ll9m$9vp$1@inconnu.isu.edu>,
  ink@inconnu.isu.edu (Craig Kelley) wrote:
> In article <76ksh9$hvt@pell.pell.portland.or.us>,
> david parsons  wrote:

> If Microsoft had invented the internet, we would have a 'distributed
> object-oriented system of controls' in which everything would be
> compiled to run on an Intel/Win32 system.

Actually, I was indirectly involved in Microsoft's attempts to create the "Commercial Internet" while I was working at Dow Jones in 1993-1995. Microsoft's vision was to use NT-powered Exchange servers that would effectively "SPAM" content directly into corporate Exchange servers, which would then push the content to users in a manner similar to Lotus Notes. Of course, all documents would be Microsoft Office documents, automatically opened by the corresponding Office application.

There were only a few problems: NT 3.51 couldn't handle more than 50 users, the "Forms" APIs were never delivered, and the project, which was supposed to take only 6 months, was still incomplete after 2 years.

Meanwhile, back at the ranch, I had been working with 3,000 publishing experts to exploit the HTML browser (Cello and Mosaic initially), the NCSA HTTP server, and the WAIS text search engine to create one of the largest repositories of news archives in the world. I introduced them to Linux, initially SLS and later Slackware. By the time Red Hat came out, there were over 2,000 Linux-powered prototypes and pilot sites. Back in those days there was no budget, and Linux was the only way to stage a proof of concept. Dow Jones was the first nationally branded web site (DowVision). I was also trying to establish Dow Jones as what we now call a "portal": a central site through which the other sites would be searched and accessed.

> Your typical web server would send an OLE version number along with dumps of control code which all could be obsoleted by newer browsers and development tools.

When I joined McGraw-Hill in mid-1995, I wrote a proposal, sent through Walt Arvin of S&P Marketscope, that led Joe Deon to order all 179 McGraw-Hill companies/publications to have web sites by the end of 1995. Brian Whitehead wanted the site to be Microsoft-centric. His team tried to design a site that used NT 3.51 servers with IIS and delivered Word documents, Excel spreadsheets, and PowerPoint pages via TCP/IP; the HTML functioned only as a "table of contents". Unfortunately, the system required powerful NT 3.51 workstations with thick clients on each end, and it created security holes you could drive a truck through. It was also slower than molasses in January, and there was no effective way to index thousands of pages of content. Marketscope alone generated over 5,000 documents/day. After nearly 2 years of failed attempts to create the "Microsoft solution", the entire Internet project was moved out of his building so that it wouldn't be corrupted by the Microsoft bigots.

> If you didn't keep your version of Windows up-to-date then the web would slowly stop working for you ("Sorry, this website requires that you have an ActiveX-whatever-we're-calling-it-today version 6.01.23 revision Z installed running under IA64/step7 with ActiveMMX and that you have DirectChrome 2k [which can be downloaded in 8 easy-to-get 100MB chunks from our Connexion servers which will probably go down on you in the middle]" -- of course, you won't be able to download those controls from Connexion because the ActiveFTP protocols are newer than your version of Microsoft WindowsExplorer can use so you just break down and buy Windows 2001).

Actually, I am currently working for a company that chose not to upgrade to Microsoft Office 97 and uses NT 4.0 workstations. It is not possible to view the Microsoft Annual Report without a special set of "Office 97 plug-ins" that provide read-only capability (you can't even save to disk). Worse, the plug-ins corrupt the registry and wipe out the legitimate Office 95 bindings. We have also had a number of consultants send us Office 97 format documents. When they resend these documents in Office 95 formats, paragraphs are strangely formatted, pictures are wrapped, spreadsheets are clipped, and graphs are unviewable.

Myth - Unix is Obsolete

> ->>One of the reasons why so many things in Unix suck is the fact they're 20 years old and have been designed for a completely different reality than today (in this example, ASCII encoding to allow use in primitive environments -- you don't even need a client program to read your mail, and no use of compression due to the low-powered CPUs of the 70's).

UNIX is an evolutionary platform. Rather than having a "revolution" every 12-24 months that renders technology obsolete (to the point of being nonfunctional on a system) in 3-5 years, UNIX takes a different approach. To begin with, UNIX was originally designed as a multimedia operating system. AT&T used it to switch and control audio, and even graphics and video, in its 5ESS and similar switching systems. Nortel also used UNIX as its switching engine. The entire system was designed to pass streams of bytes at very high speeds using pipelines and high-speed serial connections. In fact, each process chooses how to parse its input stream and how to format its output stream. In many cases, features such as compression, encryption, and encapsulation are transparent to each other.

UNIX used this same "stream of bytes" concept to implement its GUI. Much of the success of UNIX was owed to the fact that you could use surplus terminals as workstations without being restricted in your applications. Text terminals could run curses with a minimal library. If you had a more sophisticated terminal, you could create more sophisticated termcap entries, but your application software only needed to call the curses routines.
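The "stream of bytes" idea above can be sketched in a few lines: each stage is a filter that takes bytes in and puts bytes out, and stages compose without knowing about each other. This is a minimal Python sketch (the filter names are mine, not from any particular UNIX tool), using gzip to show that compression in one stage is transparent to the stages downstream:

```python
import gzip

def compress(data: bytes) -> bytes:
    """Filter: gzip the input stream."""
    return gzip.compress(data)

def decompress(data: bytes) -> bytes:
    """Filter: un-gzip the input stream."""
    return gzip.decompress(data)

def upper(data: bytes) -> bytes:
    """Filter: upcase the input stream."""
    return data.upper()

def pipeline(data: bytes, *filters) -> bytes:
    """Chain filters the way a shell pipe chains processes."""
    for f in filters:
        data = f(data)
    return data

# upper() neither knows nor cares that the bytes were
# compressed and decompressed earlier in the pipeline.
result = pipeline(b"a stream of bytes", compress, decompress, upper)
```

The shell equivalent would be something like `cat file | gzip | gunzip | tr a-z A-Z`: each process parses its own input and formats its own output, exactly as the paragraph above describes.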

When X11 came out, they wanted to be able to use UNIX hosts as cluster controllers for graphics terminals such as the Tektronix 4010, the VT330, and other graphics-oriented terminals. They also wanted to be able to create X terminals from PC hardware, which was cheap and plentiful.

Ironically, X11 was available for the PC long before a working and useful implementation of Windows was. In fact, Windows 3.0 was less reliable, more expensive, and required more frequent hardware upgrades than the Sun IPC, the competing workstation Sun had created. Had SCO and Interactive been allowed to sell UNIX for $300 instead of $3,000 (for a full suite including X11, TCP/IP, NFS, a developer kit, and online docs), Microsoft would probably have been driven off the desktop in 1991.

This was actually the incentive for supporting Linux. A number of UNIX users saw that the Intel 386 processor was powerful enough to provide a real UNIX platform. They realized that UNIX running on a 32-bit Intel-based PC, complete with X11, TCP/IP, documentation, and some good applications and implementation languages (compilers and interpreters that could be used to create applications and application frameworks), would make a number of things possible. First, it would create a viable contender to Microsoft. Second, it would make it possible to design systems that were platform-ignorant. Third, combined with the Internet, it would make possible real-time electronic commerce and electronic information sharing, enabling optimal deployment of global resources at substantial profits.

Less than 18 months from its first "unveiling", Linux had grown from a primitive kernel to a fully functional UNIX implementation. Because it was unencumbered by AT&T copyrights, there were no minimum price policies. Linux emerged in 1993 as a viable platform. At that time, SLS Linux was superior to Windows 3.1, and Slackware was giving the Windows NT betas a run for their money (no one was allowed to share this information due to nondisclosure agreements).

By the end of 1994, Yggdrasil Linux offered a feature called "Plug-and-Play": a set of diagnostics that would help the user configure the keyboard, mouse, hard drives, network cards, and video cards. This was no small feat, considering the configuration was being done on ISA, VLB, and EISA buses.

Microsoft was so threatened by this "self-installing UNIX" that it delayed Windows 95 nearly 8 months to promote a "revolutionary" standard based on PCI and a proprietary "PnP" technology protected by strict nondisclosure clauses and restrictions on reverse engineering. Ironically, many Windows 95 device drivers were debugged using Linux, but the penalty for disclosing PnP was to have your drivers and support libraries excluded from the Windows 95 CD-ROM when it was released.

To limit the proliferation of Linux coverage, any publication that gave Linux positive coverage was penalized by the withdrawal of a full-page ad. Since Microsoft controlled not only its own advertising but also the placement of OEM co-op advertising, publishers were very reluctant to cross Microsoft.

In late 1995, Microsoft released Windows 95, which did horrible things to VLB machines, required RAM and hard drive upgrades, and forced most corporations to spend over $10,000/user upgrading their PCs. Many companies had already paid $10,000/user upgrading to NT 3.51. Many were "holding out" for NT 4.0.

When NT 4.0 came out, Microsoft did everything it could to promote it as the "final solution" to UNIX. It was promoted as a server, a workstation, an enterprise server, a web server, a naming service, a file and print server, and a general application server. Microsoft offered huge incentives to consulting firms and application vendors willing to scuttle UNIX-based projects in favor of NT projects. By March of 1997 it looked like the entire world would be running NT exclusively within 6 months. At least, that's what Microsoft wanted the purchasing managers to believe.

Microsoft then started slitting its own throat. It went directly after the key revenue streams of local newspapers, including syndicated newspapers and USA Today, targeting travel, real estate, apartments, employment, and personals. Revenue that was usually spent on classified advertising was diverted directly into Microsoft's coffers, often for the express purpose of crippling the web sites of local newspapers.

The second blow was that NT failed to live up to the expectations Microsoft had set. NT projects were turning into runaways. Some projects originally estimated at 3 staff-months had dragged on for as much as 2 staff-years. The few projects that were completed crashed in production environments. Microsoft tried to stem the exodus by offering clustering, but when "Wolfpack" was finally released it was even worse, because the secondary servers were only "hot standbys". NT projects were turning into a project manager's nightmare. Corporations began frantically searching for project managers who could salvage NT projects. Ironically, many of these projects were "salvaged" by converting them to Linux and not telling upper management. In some cases, the only clue that a PC was running Linux instead of NT was that there was no monitor attached (since X11 allowed the entire network of servers to be managed from a single console).

Microsoft also slit its throat by trying to wipe out the third-party market entirely. SQL Server tried to displace Sybase, Oracle, and Informix. IE tried to displace Netscape, HotJava, and Java. Word had already displaced WordPerfect and every other serious word processor. Office had displaced offerings by Lotus, Corel, Visio, and Symantec. Microsoft had even started going after vertical markets, attempting to displace ERP, OLAP, and analysis packages such as SAS, SAP, and InfoPump with its own "frameworks"; often Microsoft tried to cut out competitors by going after the consulting firms who supported these functions. Microsoft also tried to wipe out Java applets and CORBA with ActiveX and DCOM. Microsoft was even resorting to blackmail and extortion to prevent emerging technologies such as USB, RealVideo, and interactive videoconferencing from being adopted. Microsoft may even have tried to coerce Intel into delaying the Merced chip until an NT 5.0 port could be released.

By September of 1997, the press was more than willing to cover Microsoft negatively. Furthermore, Linux was rapidly becoming a "David and Goliath" story. Linux had made deeper inroads than any Microsoft competitor since the initial release of the Apple Macintosh in 1984. Linux was able to compete and THRIVE in the face of one of the most comprehensive "lock-out" campaigns ever devised. Not only did Microsoft create legal barriers to "dual-boot" systems (Linux and Windows 95 or Windows 98 could easily have been included on new PCs, especially machines with 4-gig and 6-gig drives), but the Windows 98 upgrade also reclaimed the entire boot sequence and attempted to "wipe out" the Linux boot sequence. Several Linux vendors offered free or cheap (under $2) copies of Linux to OEMs, but Microsoft agreements prevented the OEMs from installing the new operating system.

The next big blow was an attempt to redefine licensing for Microsoft's large direct customer base of Fortune 1000 companies. With Office 97, Microsoft switched from a "per user" license to a "per processor" license, which meant that companies whose employees worked at home now had to buy second and third (laptop) copies of Office 97. With Microsoft Merchant Server, Microsoft redefined the client access license's "concurrent user" as the number of unique users in a half-hour period. In effect, a system licensed for 100 concurrent users/processor suddenly jumped to 18,000 users/processor. At $20-$50/user, the license fees were suddenly traumatic. Microsoft has been gradually extending the period that defines concurrent users, and is also using SQL queries and result sets as an indicator of how many users are "concurrent". With HTTP, a user makes a series of very short connections, but the "back-end" machine can string together sessions of 10-20 page views that may be "active" for as long as an hour.

Unfortunately for Microsoft, it was biting the hand that fed it: corporations were forced to cut staff to fund the price increases. The end result is that IT departments are now very interested in Linux, FreeBSD, and even Solaris as alternatives to NT. They are even starting to consider Linux, coupled with commercial office suites and databases, as an alternative to Microsoft's solution.

Previews of Office 2000 and Windows 2000 indicate that the situation won't be getting any better. Office 2000 now requires 6 CD-ROMs or a DVD to install nearly 6 gig of software, videos, and animations that make the paper clip look efficient. Windows 2000 appears to still have severe performance limitations, and reliability may actually be getting worse. Microsoft Transaction Server is supposed to simplify development by hiding the multithreading (sounds a bit like UNIX to me), but MTS doesn't do load balancing, fault tolerance, or fallback and recovery, and even the two-phase commit is only marginally useful. As usual, the details are restricted by nondisclosure, but whispers and discussions on the Internet indicate that Windows 2000 will be about as effective against Linux as Windows 98 was (many users hate Windows 98 so much that they don't even try for dual-boot; they just replace it with Linux).

  
 > ->  And this is a problem?  It's *GOOD* to have backwards
 > ->  compatability and easy to pick apart protocols so that everyone
 > ->  and their sister doesn't have to `upgrade' (the curse of every
 > ->  computing environment) when you tweak something New! And!
 > ->  Improved! into the protocol and, purely from an administrative
 > ->  viewpoint, when the system goes balls-up you've got a chance to
 > ->  diagnose and fix the problem.

It's also important when you are trying to audit a system for security purposes. When a hacker's message enters a UNIX or Linux system, it can be traced back to its source. Assuming the ISP and firewall are secure (routes not set by RIP), the routing can usually be traced back to the originating ISP, which can often provide the ANI information of dial-up users.
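The easy-to-pick-apart ASCII protocols make this kind of tracing possible in practice: every SMTP relay prepends a plain-text "Received:" header, so reading the headers bottom-up walks the route back toward the originating host. A minimal Python sketch, using hypothetical hosts and the standard email module (the hostnames are invented for illustration):

```python
from email import message_from_string

# A hypothetical message with two Received: headers. Each relay
# prepends its header, so the newest hop appears first.
raw = """\
Received: from mail.example.net (mail.example.net [192.0.2.10])
    by mx.example.com; Tue, 5 Jan 1999 10:00:00 -0500
Received: from dialup-42.isp.example (dialup-42.isp.example [198.51.100.42])
    by mail.example.net; Tue, 5 Jan 1999 09:58:00 -0500
From: someone@example.net
Subject: test

body
"""

msg = message_from_string(raw)

# Reverse the headers so hops[0] is the oldest hop: the relay
# closest to the message's origin.
hops = list(reversed(msg.get_all("Received")))
origin = hops[0]
```

Here `origin` names the dial-up host that first handed the message to an SMTP server; that's the ISP you would contact for the ANI information mentioned above. (Headers below the topmost few can be forged, which is why the paragraph above assumes a secure ISP and firewall.)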

If you are dealing with financial services such as banking, securities, employment, real estate, credit cards, or other regulated industries, you want to ensure that the traffic going in and out of the "bank" can be audited for compliance. In some cases, there are literally hundreds of regulatory agencies involved. Microsoft's systems make no attempt to even provide the framework for compliance. Without the source code for the infrastructure, there is no way to provide the auditing required for compliance.

  
 > Amen.
 > Craig Kelley  -- kellcrai@isu.edu
 > http://www.isu.edu/~kellcrai finger ink@inconnu.isu.edu for PGP block
 Nice pictures.

-- 
Rex Ballard - Open Source Advocate, Internet 
I/T Architect, MIS Director
http://www.open4success.com
Linux - 50 million and growing at 3%/week!