How the Web Was Won
Subject: Re: Server or E-mail?
From: dowjone!rexb (Rex Ballard)
Date: Tue, 19 Apr 94 13:00:17 EDT
Sender: jvncnet!marketplace.com!owner-online-newspapers
Before we get ready to set up the E-Mail network, we need to consider the
evolution of the Internet.
E-Mail has existed on UNIX systems almost since its inception. The
common practice of using "aliases" to distribute mailings via UUCP goes
back to the early 70's or before.
Of course, as these e-mail groups got popular, 20 or 30 people at the same
company would get on the same list. If the list generated 70 original
messages/day, that could quickly mean 1,400 to 2,100 messages/day through a
single mail server.
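The arithmetic works out as follows (a quick sketch in Python, using only the figures quoted above):

```python
# Per-recipient delivery fans out: 70 original messages/day, copied once per
# subscriber at a site with 20 to 30 people on the same list.
messages_per_day = 70
for subscribers in (20, 30):
    copies = messages_per_day * subscribers
    print(f"{subscribers} subscribers -> {copies} messages/day through one mail server")
```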
The next step was to develop a "broadcast" system, now known as "News" or
NNTP. This delivered the 70 messages once to a single server and allowed
users to "subscribe" to published mailing lists.
Over time, a threshold was crossed where there were too many groups and some
weren't being read at all. Currently there are about 4,000 newsgroups
on the Internet, generating about 50 posts/group/day on average. There are
very few servers willing or able to archive 200 megabytes/day. This does not
include any commercial services such as DJ, McGraw Hill, or Datatimes, or any
of the local newspapers. The 1,000 byte/post average doesn't include the
GIFs, TIFFs, and faxes of the alt.pictures.* groups either.
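The 200 megabyte/day figure follows directly from those averages:

```python
# Daily News volume from the figures in the text: 4,000 groups,
# ~50 posts/group/day, ~1,000 bytes/post (text only, no binaries).
groups = 4000
posts_per_group_per_day = 50
avg_bytes_per_post = 1000
daily_bytes = groups * posts_per_group_per_day * avg_bytes_per_post
print(daily_bytes // 1_000_000, "megabytes/day")
```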
The news distributions got so big that most hosts simply collect from 5 or 10
news servers. In addition, it got to the point where it was taking several
hours to sift through the mail.
Often, rather than repost information or post large pictures and source files,
people would simply point to an anonymous FTP server. A knowledgeable user
could get another 20 megabytes/day of files, digests, and resources by using
FTP over the local line. For the less sophisticated user, or the user who
didn't have access to file transfer, it was more like a tease: there was all
this wonderful information and no way to get it. Gopher made the files easier
to get.
What was needed was a smarter "server" that could receive all of this
information, index it, and deliver it to the user based on its relevance to
a user's specific interests at that moment. This is where the WAIS server
began to evolve. Instead of sorting groups by group name and headline, the
entire posting was indexed and searched using a text-retrieval system.
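The indexing idea can be sketched as a toy inverted index (the three sample postings are invented; real WAIS spoke the Z39.50 protocol and did ranked retrieval):

```python
# Toy full-text retrieval: index every word of every posting, so a reader can
# search by content instead of scanning group names and headlines.
from collections import defaultdict

postings = [
    "stock prices fell on heavy trading",
    "new mail server software released",
    "trading volume hits record on the exchange",
]

index = defaultdict(set)
for doc_id, text in enumerate(postings):
    for word in text.lower().split():
        index[word].add(doc_id)

def search(word):
    """Return the ids of postings containing the word."""
    return sorted(index.get(word.lower(), set()))

print(search("trading"))  # postings 0 and 2
```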
Now, all of this was fine for the sophisticated college students and engineers
who made up most of the Internet in 1989, but by 1991 there were more
"bean counters", managers, marketing people, and the idly curious. It got
to the point where you had to plan on giving a tutorial on FTP if you
referenced an FTP document.
Putting it all together was the job of Mosaic and HTTP. With Mosaic, you
could receive very short blurbs, search for key words, and reference more
significant documents, pictures, and files under the guidance of an interface
similar to those used in the "Help" menus of Mac, OS/2, Windows, and Unix
applications.
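The hyperlink-following at the heart of this, in outline: a document name becomes one short request to the host that serves it (the host and path below are placeholders; this is the simple one-line GET of the period's HTTP):

```python
# A hyperlink names a host and a document; following it is one short request.
host = "example.com"
path = "/index.html"
request = f"GET {path}\r\n"  # early HTTP: one line out, the document comes back
print(f"connect to {host}:80 and send: {request!r}")
```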
What's next?
Encryption/Authentication - Being able to identify both the server and the
client is critical to commercial success.
"Blinking" An integrated dialer/ppp/mosaic interface, similar to "cello"
but capable of queuing up requests, retrieving requested links, and giving
overviews in quick calls over dial-lines is important for supporting the
home user until dedicated internet connectivity is available.
For the corporate user, connecting LANs to the Internet WAN safely is another
important step. CERT is doing a great job of identifying potential risks and
alerting users to them. Encryption and real-time authentication schemes (DES-
encoded passwords) are also critical to promoting commercial on-line
publication. The issue here is that news arrives as unsolicited packets from
many different sources and must be authenticated as it arrives at the corporate
server.
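A rough sketch of authenticating each arriving item against a shared secret (HMAC-SHA-256 here is a modern stand-in for the DES-based schemes of the period, and the key and article are invented):

```python
# Each arriving article carries a tag computed from a secret shared with the
# sender; the corporate server recomputes the tag and rejects mismatches.
import hashlib
import hmac

secret = b"shared-site-key"  # invented key, distributed out of band
article = b"Subject: Market close\n\nDow up 12 points."

tag = hmac.new(secret, article, hashlib.sha256).hexdigest()  # sender attaches this

def verify(message, received_tag):
    expected = hmac.new(secret, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_tag)

print(verify(article, tag))           # untampered article passes
print(verify(article + b"!", tag))    # any alteration fails
```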
Distinguishing "Human" protocols from "Machine" protocols is also important.
An SMTP message over an Internet link is usually very redundant. In addition,
the SMTP information is difficult for a machine to parse on the recieving end.
SGML and Postscript are CPU grinders. Especially when you factor in indexing,
sorting and managing "Text". bitmap graphics, and vector graphics. Standards
for "Application Layer Protocols" optimized for machine to machine interaction
are still relatively new and limited. To get a sense of what is possible,
consider ODBC vs. SQL. SQL is easier for the human to read and originate,
but ODBC enables many different kinds of front ends to access many different
back ends. Something similar is needed for News Distribution.
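The contrast can be illustrated in a few lines (both records invented): the same news item as free-form text a machine must parse, and as a structured record a program consumes directly:

```python
# Human-oriented: headers buried in text, so the machine must parse them out.
raw = "Subject: Market close\nFrom: rexb@example.com\n\nDow up 12 points."
headers, _, body = raw.partition("\n\n")
parsed = dict(line.split(": ", 1) for line in headers.splitlines())

# Machine-oriented: fields already structured, no parsing step at all.
record = {"subject": "Market close",
          "from": "rexb@example.com",
          "body": "Dow up 12 points."}

print(parsed["Subject"], "|", record["subject"])
```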
Rex Ballard