The Metaphors of the Net





1. The Genetic Blueprint

A decade after the invention of the World Wide Web, Tim Berners-Lee is promoting the “Semantic Web”. The Internet hitherto is a repository of digital content. It has a rudimentary inventory system and very crude data location services. As a sad consequence, most of the content is invisible and inaccessible. Moreover, the Internet manipulates strings of signs, not logical or semantic propositions. In other words, the Net compares values but does not know the meaning of the values it thus manipulates. It is unable to interpret strings, to infer new facts, to deduce, derive, or otherwise comprehend what it is doing. In short, it does not understand language. Run an ambiguous term by any search engine and these shortcomings become painfully apparent. This lack of understanding of the semantic foundations of its raw material (data, information) prevents applications and databases from sharing resources and feeding each other. The Internet is discrete, not continuous. It resembles an archipelago, with users hopping from island to island in a frantic search for relevancy.

Even visionaries like Berners-Lee do not contemplate an “intelligent Web”. They are simply proposing to let users, content creators, and web developers assign descriptive meta-tags (“name of hotel”) to fields, or to strings of signs (“Hilton”). These meta-tags (arranged in semantic and relational “ontologies” – lists of meta-tags, their meanings and how they relate to each other) will be read by various applications and allow them to process the associated strings of signs correctly (place the information “Hilton” in your address book under “hotels”). This will make information retrieval more efficient and reliable and the information retrieved is bound to be more relevant and amenable to higher level processing (statistics, the development of heuristic rules, etc.). The shift is from HTML (whose tags are concerned with visual appearances and content indexing) to languages such as the DARPA Agent Markup Language, OIL (Ontology Inference Layer or Ontology Interchange Language), or even XML (whose tags are concerned with content taxonomy, document structure, and semantics). This would bring the Internet closer to the classic library card catalogue.
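
To make the mechanics concrete, here is a minimal sketch (in Python, with an invented toy ontology) of how an application might use such meta-tags; the tag names, relations, and filing destinations below are illustrative assumptions, not part of any actual Semantic Web vocabulary.

    # Toy ontology: meta-tags, how they relate, and where an application
    # should file data carrying each tag. All names here are invented.
    ontology = {
        "hotel": {"is_a": "lodging", "file_under": "address_book/hotels"},
        "lodging": {"is_a": "business"},
    }

    # Strings of signs, each annotated with a descriptive meta-tag.
    tagged_data = [
        {"value": "Hilton", "tag": "hotel"},
        {"value": "Marriott", "tag": "hotel"},
    ]

    def file_entry(entry):
        """Decide what to do with a string by consulting the ontology, not the string."""
        tag_info = ontology.get(entry["tag"], {})
        destination = tag_info.get("file_under", "address_book/misc")
        return f'{entry["value"]} -> {destination}'

    for entry in tagged_data:
        print(file_entry(entry))
    # Hilton -> address_book/hotels
    # Marriott -> address_book/hotels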

Even in its current, pre-semantic, hyperlink-dependent phase, the Internet brings to mind Richard Dawkins’ seminal work “The Selfish Gene” (OUP, 1976). This would be doubly true for the Semantic Web.

Dawkins suggested to generalize the principle of natural selection to a law of the survival of the stable. “A stable thing is a collection of atoms which is permanent enough or common enough to deserve a name”. He then proceeded to describe the emergence of “Replicators” – molecules which produced copies of themselves. The Replicators that survived in the competition for scarce raw materials were characterized by high longevity, fecundity, and copying-fidelity. Replicators (now known as “genes”) constructed “survival machines” (organisms) to protect them from the vagaries of an ever-harsher environment.

This is very reminiscent of the Internet. The “stable things” are HTML coded web pages. They are replicators – they create copies of themselves every time their “web address” (URL) is clicked. The HTML coding of a web page can be thought of as “genetic material”. It contains all the information needed to copy the page. And, exactly as in nature, the higher the longevity, fecundity (measured in links to the web page from other web sites), and copying-fidelity of the HTML code – the higher its chances to survive (as a web page).
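
As a toy illustration of “fecundity” in this sense, the following sketch simply counts the links that point at a page from other sites; the link graph is invented for the example.

    # Invented link graph: each page maps to the pages it links to.
    link_graph = {
        "siteA.com/index.html": ["example.org/page.html"],
        "siteB.com/news.html":  ["example.org/page.html", "siteA.com/index.html"],
        "siteC.com/blog.html":  ["example.org/page.html"],
    }

    def fecundity(url, graph):
        """Count inbound links to `url` from other pages in the graph."""
        return sum(url in targets for targets in graph.values())

    print(fecundity("example.org/page.html", link_graph))  # 3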

Replicator molecules (DNA) and replicator HTML have one thing in common – they are both packaged information. In the appropriate context (the right biochemical “soup” in the case of DNA, the right software application in the case of HTML code) – this information generates a “survival machine” (organism, or a web page).

The Semantic Web will only increase the longevity, fecundity, and copying-fidelity of the underlying code (in this case, OIL or XML instead of HTML). By facilitating many more interactions with many other web pages and databases – the underlying “replicator” code will ensure the “survival” of “its” web page (=its survival machine). In this analogy, the web page’s “DNA” (its OIL or XML code) contains “single genes” (semantic meta-tags). The whole process of life is the unfolding of a kind of Semantic Web.

In a prescient passage, Dawkins described the Internet:

“The first thing to grasp about a modern replicator is that it is highly gregarious. A survival machine is a vehicle containing not just one gene but many thousands. The manufacture of a body is a cooperative venture of such intricacy that it is almost impossible to disentangle the contribution of one gene from that of another. A given gene will have many different effects on quite different parts of the body. A given part of the body will be influenced by many genes and the effect of any one gene depends on interaction with many others…In terms of the analogy, any given page of the plans makes reference to many different parts of the building; and each page makes sense only in terms of cross-reference to numerous other pages.”

What Dawkins neglected in his important work is the concept of the Network. People congregate in cities, mate, and reproduce, thus providing genes with new “survival machines”. But Dawkins himself suggested that the new Replicator is the “meme” – an idea, belief, technique, technology, work of art, or bit of information. Memes use human brains as “survival machines” and they hop from brain to brain and across time and space (“communications”) in the process of cultural (as distinct from biological) evolution. The Internet is a latter day meme-hopping playground. But, more importantly, it is a Network. Genes move from one container to another through a linear, serial, monotonous process which involves prolonged periods of one on one gene shuffling (“sex”) and gestation. Memes use networks. Their propagation is, therefore, parallel, fast, and all-pervasive. The Internet is a manifestation of the growing predominance of memes over genes. And the Semantic Web may be to the Internet what Artificial Intelligence is to classic computing. We may be on the threshold of a self-aware Web.

2. The Internet as a Disorganized Library

A. The Problem of Cataloguing

The Internet is an assortment of billions of pages which contain information. Some of them are visible and others are generated from hidden databases by users’ requests (“Invisible Internet”).

The Internet displays no discernible order, classification, or categorization. Amazingly, as opposed to “classical” libraries, no one has yet invented a (sorely needed) Internet cataloguing standard (remember Dewey?). Some sites indeed apply the Dewey Decimal System to their contents (Suite101). Others default to a directory structure (Open Directory, Yahoo!, Look Smart and others).

Had such a standard existed (an agreed upon numerical cataloguing method) – each site could have self-classified. Sites would have had an interest in doing so, to increase their visibility. This, naturally, would have eliminated the need for today’s clunky, incomplete and (highly) inefficient search engines.

Thus, a site whose number starts with 900 will be immediately identified as dealing with history, and multiple classification will be encouraged to allow finer cross-sections to emerge. An example of such an emerging technology of “self classification” and “self-publication” (though limited to scholarly resources) is the “Academic Resource Channel” by Scindex.
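
A minimal sketch (in Python) of such self-classification, assuming a site publishes one or more Dewey-style numbers and a catalogue resolves their leading digits to broad subjects; only a handful of the Dewey hundreds classes are listed here, for illustration:

    # A fragment of the Dewey hundreds classes, keyed by leading digit.
    DEWEY_HUNDREDS = {
        "0": "Computer science, information & general works",
        "3": "Social sciences",
        "5": "Science",
        "9": "History & geography",
    }

    def classify(codes):
        """Resolve each self-assigned code (e.g. '940.53') to its broad subject."""
        return [DEWEY_HUNDREDS.get(code[0], "Unclassified") for code in codes]

    # A site on World War II history might self-classify like this:
    print(classify(["940.53"]))            # ['History & geography']
    # Multiple classification allows finer cross-sections:
    print(classify(["940.53", "355.02"]))  # ['History & geography', 'Social sciences']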

Moreover, users will not be required to remember reams of numbers. Future browsers will be akin to catalogues, very much like the applications used in modern day libraries. Compare this utopia to the current dystopia. Users struggle with mounds of irrelevant material to finally reach an incomplete and disappointing destination. At the same time, there likely are web sites which exactly match the poor user’s needs. Yet, the chances of a happy encounter between user and content are currently determined by the whims of the specific search engine used and by things like meta-tags, headlines, fees paid, or the right opening sentences.

B. Screen vs. Page

The computer screen, because of physical limitations (size, the fact that it has to be scrolled), fails to effectively compete with the printed page. The latter is still the most ingenious medium yet invented for the storage and release of textual information. Granted: a computer screen is better at highlighting discrete units of information. So, these differing capacities draw the battle lines: structures (printed pages) versus units (screen), the continuous and easily reversible (print) versus the discrete (screen).

The solution lies in finding an efficient way to translate computer screens to printed matter. It is hard to believe, but no such thing exists. Computer screens are still hostile to off-line printing. In other words: if a user copies information from the Internet to his word processor (or vice versa, for that matter) – he ends up with a fragmented, garbage-filled and unaesthetic document.

Very few site developers try to do something about it – even fewer succeed.

C. Dynamic vs. Static Interactions

One of the biggest mistakes of content suppliers is that they do not provide a “static-dynamic interaction”.

Internet-based content can now easily interact with other media (e.g., CD-ROMs) and with non-PC platforms (PDAs, mobile phones).

Examples abound:

A CD-ROM shopping catalogue interacts with a Web site to allow the user to order a product. The catalogue could also be updated by the site (as is the practice with CD-ROM encyclopedias). The advantages of the CD-ROM are clear: very fast access time (dozens of times faster than the access to a Web site using a dial up connection) and a data storage capacity hundreds of times bigger than the average Web page.

Another example:

A PDA plug-in disposable chip containing hundreds of advertisements or a “yellow pages”. The consumer selects the ad or entry that she wants to see and connects to the Internet to view a relevant video. She could then also have an interactive chat (or a conference) with a salesperson, receive information about the company, about the ad, about the advertising agency which produced the ad – and so on.

CD-ROM based encyclopedias (such as the Britannica, or the Encarta) already contain hyperlinks which carry the user to sites chosen by an Editorial Board.

A note:

CD-ROMs are probably a doomed medium. Storage capacity increases exponentially and, within a year, desktops with 80 Gb hard disks will be a common sight. Moreover, the much heralded Network Computer – the stripped down version of the personal computer – will put at the disposal of the average user terabytes of storage capacity and the processing power of a supercomputer. What separates computer users from this utopia is the communication bandwidth. With the introduction of radio and satellite broadband services, DSL and ADSL, cable modems coupled with advanced compression standards – video (on demand), audio and data will become available speedily and plentifully.

The CD-ROM, however, is not mobile. It requires installation and the utilization of complex hardware and software. This is no user friendly push technology. It is nerd-oriented. As a consequence, CD-ROMs are not an immediate medium. There is a long time lapse between the moment of purchase and the moment the user accesses the data. Compare this to a book or a magazine. Data in these oldest of media is immediately available to the user and they allow for easy and accurate “back” and “forward” functions.

Perhaps the biggest mistake of CD-ROM manufacturers has been their inability to offer an integrated hardware and software package. CD-ROMs are not compact. A Walkman is a compact hardware-cum-software package. It is easily transportable, it is thin, it contains numerous, user-friendly, complex functions, it provides immediate access to data. So does the discman, or the MP3-man, or the new generation of e-books (e.g., E-Ink’s). This cannot be said about the CD-ROM. By tying its future to the obsolete concept of stand-alone, expensive, inefficient and technologically unreliable personal computers – CD-ROMs have sentenced themselves to oblivion (with the possible exception of reference material).

D. Online Reference

A visit to the on-line Encyclopaedia Britannica demonstrates some of the tremendous, mind boggling possibilities of online reference – as well as some of the obstacles.

Each entry in this huge work of reference is hyperlinked to relevant Web sites. The sites are carefully screened. Links are available to data in various forms, including audio and video. Everything can be copied to the hard disk or to a R/W CD.

This is a new conception of a knowledge centre – not just a heap of material. The content is modular and continuously enriched. It can be connected to a voice Q&A centre. Queries by subscribers can be answered by e-mail, by fax, posted on the site, or sent as hard copies by post. This “Trivial Pursuit” or “homework” service could be very popular – there is a great appetite for “Just in Time Information”. The Library of Congress – together with a few other libraries – is in the process of making just such a service available to the public (CDRS – Collaborative Digital Reference Service).

E. Derivative Content

The Internet is an enormous reservoir of archives of freely accessible, or even public domain, information.

With a minimal investment, this information can be gathered into logical, theme oriented, cheap compilations (on CD-ROMs, print, e-books or other media).

F. E-Publishing

The Internet is by far the world’s largest publishing platform. It incorporates FAQs (Q&A’s regarding almost every technical matter in the world), e-zines (electronic magazines), the electronic versions of print dailies and periodicals (in conjunction with on-line news and information services), reference material, e-books, monographs, articles, minutes of discussions (“threads”), conference proceedings, and much more besides.

The Internet offers major advantages to publishers. Consider the electronic version of a p-zine.

Publishing an e-zine promotes the sales of the printed edition, it helps sign on subscribers and it leads to the sale of advertising space. The electronic archive function (see next section) saves the need to file back issues, the physical space required to do so and the irritating search for data items.

The future trend is a combined subscription to both the electronic edition (mainly for the archival value and the ability to hyperlink to additional information) and to the print one (easier to browse the current issue). The Economist is already offering free access to its electronic archives as an inducement to its print subscribers.

The electronic daily presents other advantages:

It allows for immediate feedback and for flowing, almost real-time, communication between writers and readers. The electronic version, thus, acquires a gyroscopic function: a navigation instrument, always suggesting deviations from the “right” course. The content can be immediately updated and breaking news incorporated in older content.

Specialty hand held devices already allow for downloading and storage of great quantities of data (up to 4000 print pages). The user gains access to libraries containing hundreds of texts, amenable to be downloaded, stored and read by the specific device. Again, a convergence of standards is to be expected in this field as well (the final contenders will probably be Adobe’s PDF against Microsoft’s MS-Reader).

Currently, e-books are dichotomously treated either as:

Continuation of print books (p-books) by other means, or as a whole new publishing universe.

Since p-books are a more convenient medium than e-books – they will prevail in any straightforward “medium substitution” or “medium displacement” battle.

In other words, if publishers persist in the simple and straightforward conversion of p-books to e-books – then e-books are doomed. They are simply inferior and cannot offer the comfort, tactile delights, browseability and scanability of p-books.

But e-books – being digital – open up a vista of hitherto neglected possibilities. These will only be enhanced and enriched by the introduction of e-paper and e-ink. Among them:

  • Hyperlinks within the e-book and without it – to web content, reference works, etc.;
  • Embedded instant shopping and ordering links;
  • Divergent, user-interactive, decision driven plotlines;
  • Interaction with other e-books (using a wireless standard) – collaborative authoring or reading groups;
  • Interaction with other e-books – gaming and community activities;
  • Automatically or regularly updated content;
  • Multimedia;
  • Database, Favourites, Annotations, and History Maintenance (archival records of reading habits, shopping habits, interaction with other readers, plot related decisions and much more);
  • Automatic and embedded audio conversion and translation capabilities;
  • Full wireless piconetworking and scatternetworking capabilities.

The technology is still not fully there. Wars rage in both the wireless and the e-book realms. Platforms compete. Standards clash. Gurus argue. But convergence is inevitable and with it the e-book of the future.

G. The Archive Function

The Internet is also the world’s biggest cemetery: tens of thousands of deadbeat sites, still accessible – the “Ghost Sites” of this electronic frontier.

This, in a way, is collective memory. One of the Internet’s main functions will be to preserve and transfer knowledge through time. It is called “memory” in biology – and “archive” in library science. The history of the Internet is being preserved by search engines (Google) and specialized services (Alexa) alike.

3. The Internet as a Collective Nervous System

Drawing an analogy from the development of a human infant – the human race has just begun to develop its nervous system.

The Internet fulfils all the functions of the nervous system in the body and is, both functionally and structurally, quite similar. It is decentralized and redundant (each part can serve as functional backup in case of malfunction). It hosts information which is accessible through various paths, it contains a memory function, and it is multimodal (multimedia – textual, visual, audio and animation).

I believe that the comparison is not superficial and that studying the functions of the brain (from infancy to adulthood) is likely to shed light on the future of the Net itself. The Net – exactly like the nervous system – provides pathways for the transport of goods and services – but also of memes and information, their processing, modeling, and integration.

A. The Collective Computer

Carrying the metaphor of “a collective brain” further, we would expect the processing of information to take place on the Internet, rather than inside the end-user’s hardware (the same way that information is processed in the brain, not in the eyes). Desktops will receive the results and communicate with the Net to receive additional clarifications and instructions and to convey information gathered from their environment (mostly, from the user).

Put differently:

In the future, servers will contain not only information (as they do today) – but also software applications. The user of an application will not be forced to buy it. He will not be pushed into hardware-related expenditures to adjust to the ever growing size of applications. He will not find himself wasting his scarce memory and computing resources on passive storage. Instead, he will use a browser to call a central computer. This computer will contain the needed software, broken up into its components (applets, small applications). Anytime the user wishes to use one of the functions of the application, he will siphon it off the central computer. When finished – he will “return” it. Processing speeds and response times will be such that the user will not feel at all that he is not interacting with his own software (the question of ownership will be very clouded). This technology is available and it has provoked a heated debate about the future shape of the computing industry as a whole (desktops – really power packs – or network computers, merely dumb terminals?). Access to online applications is already offered to corporate users by ASPs (Application Service Providers).
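
The following is a toy, single-machine simulation (in Python) of that borrow-use-return model; the registry of “applets” and their names are invented stand-ins for what would, in practice, be fetched from a remote server.

    # Invented "central computer": a registry of small applets.
    CENTRAL_COMPUTER = {
        "spell_check": lambda text: text.replace("teh", "the"),
        "word_count":  lambda text: len(text.split()),
    }

    class Browser:
        """The user's only tool: borrow an applet, use it, give it back."""

        def __init__(self, server):
            self.server = server
            self.borrowed = {}

        def borrow(self, applet_name):
            self.borrowed[applet_name] = self.server[applet_name]

        def use(self, applet_name, *args):
            return self.borrowed[applet_name](*args)

        def give_back(self, applet_name):
            del self.borrowed[applet_name]  # nothing remains stored locally

    browser = Browser(CENTRAL_COMPUTER)
    browser.borrow("word_count")
    print(browser.use("word_count", "the collective computer at work"))  # 5
    browser.give_back("word_count")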

In the last few years, scientists have harnessed the combined power of online PCs to perform astonishing feats of distributed parallel processing. Millions of PCs connected to the net co-process signals from outer space, meteorological data, and solve complicated equations. This is a prime example of a collective brain in action.
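
A minimal, single-machine sketch of the work-unit pattern behind such projects, with local worker processes standing in for the volunteer PCs and a trivial summation standing in for real signal analysis:

    from multiprocessing import Pool

    def analyse_chunk(chunk):
        """Stand-in for real signal analysis: just sum the samples."""
        return sum(chunk)

    if __name__ == "__main__":
        signal = list(range(1_000_000))                     # the full dataset
        chunks = [signal[i:i + 100_000]                     # cut into work units
                  for i in range(0, len(signal), 100_000)]

        with Pool() as workers:                             # the "volunteer PCs"
            partial_results = workers.map(analyse_chunk, chunks)

        print(sum(partial_results))                         # combined result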

B. The Intranet – a Logical Extension of the Collective Computer

LANs (Local Area Networks) are no longer a rarity in corporate offices. WANs (Wide Area Networks) are used to connect geographically distributed organs of the same legal entity (branches of a bank, daughter companies of a conglomerate, a sales force). Many LANs and WANs are going wireless.

The wireless intranet/extranet and LANs are the wave of the future. They will gradually eliminate their fixed line counterparts. The Internet offers equal, platform-independent, location-independent and time-of-day-independent access to corporate memory and nervous systems. Sophisticated firewall security applications protect the privacy and confidentiality of the intranet from all but the most determined and savvy crackers.

The Intranet is an intra-organizational communication network, constructed on the platform of the Internet, and it therefore enjoys all of the latter’s advantages. The extranet is open to clients and suppliers as well.

The company’s server can be accessed by anyone authorized, from anywhere, at any time (with local – rather than international – communication costs). The user can leave messages (internal e-mail or v-mail), access information – proprietary or public – from it, and participate in “virtual teamwork” (see next chapter).

The development of measures to safeguard server routed inter-organizational communication (firewalls) is the solution to one of two obstacles to the institutionalization of Intranets. The second problem is the limited bandwidth which does not permit the efficient transfer of audio (not to mention video).

It is difficult to conduct video conferencing through the Internet. Even the voices of discussants who use internet phones (IP telephony) come out (though very slightly) distorted.

All this did not prevent 95% of the Fortune 1000 from installing intranets. 82% of the rest intend to install one by the end of this year. Medium to large American firms have 50-100 intranet terminals per every internet one.

One of the greatest advantages of the intranet is the ability to transfer documents between the various parts of an organization. Consider Visa: it transferred 2 million documents per day internally in 1996.

An organization equipped with an intranet can (while protected by firewalls) give its clients or suppliers access to non-classified correspondence or inventory systems. Many B2B exchanges and industry-specific purchasing management systems are based on extranets.

C. The Transport of Information – Mail and Chat

The Internet (its e-mail function) is eroding traditional mail. 90% of customers with on-line access use e-mail now and then and 60% work with it regularly. More than 2 billion messages traverse the internet daily.

E-mail applications are available as freeware and are included in all browsers. Thus, the Internet has completely assimilated what used to be a separate service, to the extent that many people make the mistake of thinking that e-mail is a feature of the Internet.

The internet will do to phone calls what it has done to mail. Already there are applications (Intel’s, Vocaltec’s, Net2Phone) which permit the user to conduct a phone conversation through his computer. The voice quality has improved. The discussants can cut into each other’s words, argue and listen to tonal nuances. Today, the parties (two or more) engaging in the conversation must possess the same software and the same (computer) hardware. In the very near future, computer-to-regular phone applications will eliminate this requirement. And, again, simultaneous multi-modality: the user can talk over the phone, see his party, send e-mail, receive messages and transfer documents – without interrupting the flow of the conversation.

The cost of transferring voice will become so negligible that free voice traffic is conceivable in 3-5 years. Data traffic will overtake voice traffic by a wide margin.

The next phase will probably include virtual reality. Each of the parties will be represented by an “avatar”, a 3-D figurine generated by the application (or the user’s likeness mapped and superimposed on the avatar). These figurines will be multi-dimensional: they will possess their own communication patterns, special habits, history, preferences – in short: their own “personality”.

Thus, they will be able to maintain an “identity” and a consistent pattern of communication which they will develop over time.

Such a figure could host a site, accept, welcome and guide visitors, all the time bearing their preferences in its electronic “mind”. It could relate the news, as the digital anchor “Ananova” does. Visiting sites in the future is bound to be a much more pleasant affair.

D. The Transport of Value – E-cash

In 1996, four corporate giants (Visa, MasterCard, Netscape and Microsoft) agreed on a standard for effecting secure payments through the Internet: SET. Internet commerce is supposed to mushroom to $25 billion by 2003. Site owners will be able to collect rent from passing visitors – or fees for services provided within the site. Amazon instituted an honour system to collect donations from visitors. PayPal provides millions of users with cash substitutes. Gradually, the Internet will compete with central banks and banking systems in money creation and transfer.

E. The Transport of Interactions – The Virtual Organization

The Internet allows for simultaneous communication and the efficient transfer of multimedia (video included) files between an unlimited number of users. This opens up a vista of mind boggling opportunities which are the real core of the Internet revolution: the virtual collaborative (“Follow the Sun”) modes.

Examples:

A group of musicians is able to compose music or play it – while spatially and temporally separated;

Advertising agencies are able to co-produce ad campaigns in a real time interaction;

Cinema and TV films are produced from disparate geographical spots through the teamwork of people who never meet, except over the Net.

These examples illustrate the concept of the “virtual community”. Space and time will no longer hinder team collaboration, be it scientific, artistic, cultural, or an ad hoc arrangement for the provision of a service (a virtual law firm, or accounting office, or a virtual consultancy network). The intranet can also be thought of as a “virtual organization”, or a “virtual business”.

The virtual mall and the virtual catalogue are prime examples of spatial and temporal liberation.

In 1998, there were well over 300 active virtual malls on the Internet. In 2000, they were frequented by 46 million shoppers, who shopped in them for goods and services.

The virtual mall is an Internet “space” (pages) wherein “shops” are located. These shops offer their wares using visual, audio and textual means. The visitor passes through a virtual “gate” or storefront and examines the merchandise on offer, until he reaches a buying decision. Then he engages in a feedback process: he pays (with a credit card), buys the product, and waits for it to arrive by mail (or downloads it).

The manufacturers of digital products (intellectual property such as e-books or software) have begun selling their merchandise on-line, as file downloads. Yet, slow communication speeds, competing file formats and reader standards, and limited bandwidth – constrain the growth potential of this mode of sale. Once these are resolved – intellectual property will be sold directly from the Net, on-line. Until such time, the mediation of the Post Office is still required. As long as this is the state of the art, the virtual mall is nothing but a glorified computerized mail catalogue or Shopping Channel, the only difference being the exceptionally varied inventory.

Websites which started as “specialty stores” are fast transforming themselves into multi-purpose virtual malls. Amazon.com, for example, has bought into a virtual pharmacy and into other virtual businesses. It is now selling music, video, electronics and many other products. It started as a bookstore.

This contrasts with a much more creative idea: the virtual catalogue. It is a form of narrowcasting (as opposed to broadcasting): a surgically accurate targeting of potential consumer audiences. Each group of profiled consumers (no matter how small) is fitted with their own – digitally generated – catalogue. This is updated daily: the variety of wares on offer (adjusted to reflect inventory levels, consumer preferences, and goods in transit) – and prices (sales, discounts, package deals) change in real time. Amazon has incorporated many of these features on its web site. The user enters the web site and there delineates his consumption profile and his preferences. A customized catalogue is immediately generated for him, including specific recommendations. The history of his purchases, preferences and responses to feedback questionnaires is accumulated in a database. This intellectual property may well be Amazon’s main asset.
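
A toy sketch of such a virtual catalogue, with invented products, prices, and profile fields: a per-consumer profile filters and re-prices a shared inventory, so each shopper sees a different, continuously updated catalogue.

    # Invented inventory and consumer profile.
    catalogue = [
        {"title": "WWII Histories", "category": "books",       "price": 30.0,  "stock": 12},
        {"title": "Jazz Box Set",   "category": "music",       "price": 45.0,  "stock": 0},
        {"title": "Pocket Camera",  "category": "electronics", "price": 120.0, "stock": 4},
    ]

    profile = {
        "preferred_categories": {"books", "music"},
        "purchase_history": ["WWII Histories"],
        "discount": 0.10,   # hypothetical loyalty discount
    }

    def customized_catalogue(catalogue, profile):
        """Return only in-stock items in preferred categories, re-priced per profile."""
        result = []
        for item in catalogue:
            if item["stock"] > 0 and item["category"] in profile["preferred_categories"]:
                offer = dict(item, price=round(item["price"] * (1 - profile["discount"]), 2))
                result.append(offer)
        return result

    print(customized_catalogue(catalogue, profile))
    # [{'title': 'WWII Histories', 'category': 'books', 'price': 27.0, 'stock': 12}]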

There are no technological obstacles to implementing this vision today – only administrative and legal (patent) ones. Big brick and mortar retail stores are not up to processing the flood of data expected to result. They also remain highly sceptical regarding the feasibility of the new medium. And privacy issues prevent data mining or the effective collection and usage of personal data (remember the case of Amazon’s “Readers’ Circles”).

The virtual catalogue is a private case of a new internet off-shoot: the “smart (shopping) agents”. These are AI applications with “long memories”.

They draw detailed profiles of consumers and users and then suggest purchases and refer to the appropriate sites, catalogues, or virtual malls.

They also provide price comparisons, and the new generation cannot be blocked or fooled by the use of differing product categories.

In the future, these agents will also cover brick and mortar retail chains and, in conjunction with wireless, location-specific services, issue a map of the branch or store closest to an address specified by the user (the default being his residence), or yielded by his GPS enabled wireless mobile or PDA. This technology can be seen in action in a few music sites on the web and is likely to become dominant with wireless internet appliances. The owner of an internet enabled (third generation) mobile phone is likely to be the target of geographically-specific marketing campaigns, ads and special offers pertaining to his current location (as reported by his GPS – the satellite-based Global Positioning System).

F. The Transport of Information – Internet News

Internet news enjoys built-in advantages. It is frequently and dynamically updated (unlike static print news) and is always accessible (similar to print news), immediate and fresh.

The future will witness a form of interactive news. A special “corner” in the news Web site will accommodate “breaking news” posted by members of the public (or corporate press releases). This will provide readers with a glimpse into the making of the news, the raw material news are made of. The same technology will be applied to interactive TVs. Content will be downloaded from the internet and displayed as an overlay on the TV screen or in a box within it. The contents downloaded will be directly connected to the TV programming. Thus, the biography and track record of a football player will be displayed during a football match and the history of a country when it gets news coverage.
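
A toy illustration of content keyed to the current programme: the overlay service looks up supplementary items by the programme’s metadata tags (both the tags and the items are invented for the example).

    # Invented mapping from programme tags to downloadable overlay items.
    supplementary_content = {
        "football": "Biography and track record of the featured player",
        "election": "History and recent coverage of the country in the news",
    }

    def overlay_for(programme_tags):
        """Pick overlay items matching the current programme's metadata."""
        return [supplementary_content[tag]
                for tag in programme_tags if tag in supplementary_content]

    print(overlay_for(["football", "live"]))
    # ['Biography and track record of the featured player']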

4. Terra Internetica – Internet, an Unknown Continent

Laymen and experts alike talk about “sites” and “advertising space”. Yet, the Internet was never compared to a new continent whose surface is infinite.

The Internet has its own real estate developers and construction companies. The real life equivalents derive their profits from the scarcity of the resources that they exploit – the Internet counterparts derive their profits from the tenants (content producers and distributors, e-tailers, and others).

Entrepreneurs bought “Internet Space” (pages, domain names, portals) and leveraged their acquisition commercially by:

  • Renting space out;
  • Constructing infrastructure on their property and selling it;
  • Providing an intelligent gateway, entry point (portal) to the rest of the internet;
  • Selling advertising space which subsidizes the tenants (Yahoo!-Geocities, Tripod and others);
  • Cybersquatting (purchasing specific domain names identical to brand names in the “real” world) and then selling the domain name to an interested party.

Internet Space can be easily purchased or produced. The investment is low and getting lower with the introduction of competition in the field of domain registration services and the increase in the number of top level domains.

Then, infrastructure can be erected – for a shopping mall, for free home pages, for a portal, or for another purpose. It is precisely this infrastructure that the developer can later sell, lease, franchise, or rent out.

But this real estate bubble was the culmination of a long and tortuous process.

At the beginning, only members of the fringes and the avant-garde (inventors, risk assuming entrepreneurs, gamblers) invest in a new invention. No one knows what the optimal uses of the invention are (in other words, what its future is). Many – mostly members of the scientific and business elites – argue that there is no real need for the invention and that it substitutes a new and untried way for old and tried modes of doing the same things (so why assume the risk of investing in the unknown and the untried?).

Moreover, these criticisms are usually well-founded.

To start with, there is, indeed, no need for the new medium. A new medium invents itself – and the need for it. It also generates its own market to satisfy this newly found need.

Two prime examples of this self-recursive process are the personal computer and the compact disc.

When the PC was invented, its uses were completely unclear. Its performance was lacking, its abilities limited, it was unbearably user unfriendly. It suffered from faulty design, was devoid of user comfort and ease of use, and required considerable professional knowledge to operate. The worst part was that this knowledge was exclusive to the new invention (not portable). It reduced labour mobility and limited one’s professional horizons. There were many gripes among workers assigned to tame the new beast. Managers regarded it at best as a nuisance.

The PC was thought of, at the beginning, as a complex gaming machine, an electronic baby-sitter. It included a keyboard, so it was thought of in terms of a glorified typewriter or spreadsheet. It was used mainly as a word processor (and the outlay justified solely on these grounds). The spreadsheet was the first real PC application and it demonstrated the advantages inherent to this new machine (mainly flexibility and speed). Still, it was more of the same. A speedier slide rule. After all, said the unconvinced, what was the difference between this and a hand held calculator (some of which already had computing, memory and programming features)?

The PC was recognized as a medium only 30 years after it was invented, with the introduction of multimedia software. All this time, the computer continued to spin off markets and secondary markets, needs and professional specialties. The talk, as always, was centred on how to improve on existing markets and solutions.

The Internet is the computer’s first important application. Hitherto the computer was only quantitatively different from other computing or gaming devices. Multimedia and the Internet have made it qualitatively superior, sui generis, unique.

Part of the problem was that the Internet was invented, is maintained and is operated by computer professionals. For decades these people have been conditioned to think in Olympic terms: faster, stronger, higher – not in terms of the new, the unheard of, or the non-existent. Engineers are trained to improve – seldom to invent. With few exceptions, its creators stumbled across the Internet – it invented itself despite them.

Computer professionals (hardware and software experts alike) – are linear thinkers. The Internet is non linear and modular.

It is still the age of hackers. There is still a lot to be done in improving technological prowess and powers. But their control of the contents is waning and they are being gradually replaced by communicators, creative people, advertising executives, psychologists, venture capitalists, and the totally unpredictable masses who flock to flaunt their home pages and graphomania.

These all are attuned to the user, his mental needs and his information and entertainment preferences.

The compact disc is a different tale. It was deliberately invented to improve upon an existing technology (basically, Edison’s Gramophone). Market-wise, this was a major gamble. The improvement was, at first, debatable (many said that the sound quality of the first generation of compact discs was inferior to that of its contemporaneous record players). Consumers had to be convinced to change both software and hardware and to dish out thousands of dollars just to listen to what the manufacturers claimed was a more authentically reproduced sound. A better argument was the longer life of the software (though when contrasted with the limited life expectancy of the consumer, some of the first sales pitches sounded absolutely morbid).

The computer suffered from unclear positioning. The compact disc was very clear as to its main functions – but had a rough time convincing consumers that it was needed.

Every medium is first controlled by the technical people. Gutenberg was a printer – not a publisher. Yet, he is the world’s most famous publisher. The technical cadre is joined by dubious or small-scale entrepreneurs and, together, they establish ventures with no clear vision, market-oriented thinking, or orderly plan of action. The legislator is also dumbfounded and does not grasp what is happening – thus, there is no legislation to control the use of the medium. Witness the initial confusion concerning copyrighted vs. licensed software, e-books, and the copyrights of ROM embedded software. Abuse or under-utilization of resources grow. The sale of radio frequencies to the first cellular phone operators in the West – a situation which repeats itself in Eastern and Central Europe nowadays – is an example.

But then more complicated transactions – exactly as in real estate in “real life” – begin to appear. The Internet is likely to converge with “real life”. It is likely to be dominated by brick and mortar entities which are likely to import their business methods and management. As its eccentric past (the dot.com boom and the dot.bomb bust) recedes – a sustainable and profitable future awaits it.



