Subject: Re: PDF: user perspectives
From: "Claude L. Bullard"
Date: Wed, 29 Mar 1995 15:14:39 -0500
Lines: 90

Geoffrey James, in his book "Document Databases" (ISBN 0-442-28185-4),
notes the following: "Electronic pagesetting is designed primarily for
newspapers and magazines, single-format presentations that are seldom
reused.  The new publications technology addresses the problem of
once-only publications (memos, newspapers, etc.) but fails to offer a
solution to the information crisis... certain time factors remain constant
with or without word processing technology... Something was missing in
their strategy, an omission that greatly reduced the productivity gains
they expected.  They only changed technology, not methodology."

James goes on to compare this to the Gutenberg myth, that is, the belief
that the invention of the printing press itself had a profound effect on
Western culture, and says it isn't true.  He supports this with the
following: at the time of the invention, the largest European library held
about a thousand hand-lettered volumes.  A hundred years later, only 3000
volumes had been added.  Why?  Each printed book was still hand-crafted to
resemble a hand-lettered manuscript, so only the very wealthy could afford
the books.  Only when manufacturing methods took cost versus style into
account (e.g., using italic type instead of hand-illuminated fonts to
reduce the size and complexity of the type and increase the number of
units per print run) did any real revolution in knowledge occur.  A change
in technology improves things arithmetically.  A change in methodology
improves things factorially.  That is when cost comes down, distribution
increases, and the integration of information into knowledge advances.

I have asserted before that there will be no real evolution until the WWW
community begins to understand that the insistence by the developers of
HTML, and now Acrobat, on the *richness* of display over the *richness* of
improved location models severely limits the application of document
databases on the Internet.  Until the adoption of international standards,
not "de facto" standards, begins in earnest in this area, there will be
only limited improvements, and each one of them will further subjugate
Internet content to increasingly proprietary technologies.
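
To make "improved location models" concrete, here is a minimal sketch in
the HyTime (ISO/IEC 10744) style; nameloc, nmlist, and clink are HyTime
architectural forms, while the document name and IDs below are invented
for illustration:

  <!-- A named location address: resolve the name "sec3" as an
       element ID inside the document entity "report1". -->
  <nameloc id="procloc">
    <nmlist nametype="element" docorsub="report1">sec3</nmlist>
  </nameloc>

  <!-- A contextual link that points at the location, not the page.
       The address names structure, so it survives any reformatting. -->
  <clink linkend="procloc">See the teardown procedure</clink>

A link like this addresses a structural location in a document database;
a page-description link can only address a spot on one rendered page.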

You cannot fight the *big money* and *vested interests*.  What they are
doing is completely within the rules of commerce no matter how we think it
may impact us.  It is unclear if we can outthink them.  Standards-centered
groups and even the IETF operate in the open.  The commercial industry does
not.  They can harvest your ideas as soon as you publish them.  They can
rewrite them, patent or copyright them, and use them to extinguish your
efforts.  The involvement of NIST in this makes it even more difficult
because the inked deals of the corporations may carry the additional
impetus of government policy.  For the Americans, at least, this is bad
news, but then international readers should note that the players in the
deal noted in the announcement are American corporations seeking to create
"defacto" standards for an international resource: the Internet.  Your only
vote is your money.

What can be done?  The public doesn't understand anything beyond a
formatted page.

You are now in a features war in which the only hope for survival is to
create apps that share features that make their interoperation possible
and free of license encumbrance.  This does not mean that you cannot
compete or make a profit.  Seek stable ground on which to build your
products, ground that provides cover and allies.  Freeware hacking is not
the way, although the freeware hackers should consider joining the
international standards groups lest they become the *experimental fringe*.
This is not to denigrate their work but to insist that their work is
positive and should not be subject to a media whose opinions are all too
often formed by the sources of its revenue, sources which then effectively
control market opinion.

For the SGML community of users and vendors, it is time to focus on
implementing the emerging family of standards and to ensure that your
applications are clearly superior in functionality.  Superior functionality
emerges when such apps operate as a suite, not as standalone browsers.

Merely formatting the legacy of yesterday will not do the trick.  Some SGML
vendors are already compromised because they have chosen to walk both sides
of the fence on these issues to increase market share.  Only time will tell
if this tactic has been effective.  The users can offer opinions, but the
money will be spent in the offices of upper management, and the denizens of
those offices will find Clark and Warnock's arguments convincing.

For the rest of us who are developers and users of SGML applications, it is
time to move away from the book publishers and app developers who insist
that preserving the book legacy is the only way to advance electronic
publishing.  This is already doable with SGML and has been done again and
again while SGMLists and others quibble over <title> vs <tit>.  Time to
look beyond the book metaphor to what is possible with SGML that cannot be
done with Acrobat.

They'll fix the bugs, Jeff.  They'll give away the fonts if they have to.
They have a very large and impressive suite of apps that will be made to
interoperate. They will crush or co-opt VRML and any other sensory mode
notation that emerges.  A larger definition of what a document is and can
be must be adopted.  Multi-modal documents are the future.  SGML is the
cleanest, most open standard for this.
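
As a sketch of what a multi-modal SGML document instance might look like,
assuming a hypothetical DTD in which text, video, and audio are peer
content objects (all names below are invented for illustration):

  <!DOCTYPE briefing SYSTEM "briefing.dtd" [
    <!NOTATION mpeg SYSTEM "mpeg">
    <!NOTATION wave SYSTEM "wave">
    <!ENTITY walkthru  SYSTEM "teardown.mpg"  NDATA mpeg>
    <!ENTITY narration SYSTEM "narration.wav" NDATA wave>
  ]>
  <briefing>
    <title>Pump Assembly Teardown</title>
    <step id="step1">
      <para>Remove the housing bolts in the order shown.</para>
      <video entity="walkthru">   <!-- motion video as peer content -->
      <audio entity="narration">  <!-- synchronized narration -->
    </step>
  </briefing>

The rendering of any one mode is left to the application suite; the
structure and the connections among the modes stay in an open, standard
notation instead of a proprietary page description.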

Len Bullard

