The Internet is actually a child of the Cold War. Faced with the prospect of a nuclear war, a resilient network was required such that, should one communications centre be destroyed, messages would automatically re-route themselves to ensure they could still get from A to B.
This led to ARPANET which became an American Universities network, and ultimately the Internet of today.
In the early days, sending information via this network was neither particularly rapid nor easy. At this time you had to be fairly technically minded and working at a university or major research establishment to use the Net.
But over time, more and more useful utilities and tools were being developed, and as computer hardware dropped in price and commerce became interested, the Net began to spread, and really hasn't stopped since.
The "half-life" of the Internet is around 10 months; that is, 10 months ago it was half the size it is now. This growth has been going on for years, and looks set to continue until, a few years from now, almost every computer in the world will access the Net. This process itself encourages computer ownership, to the extent that "getting on the Internet" is now a major reason for purchasing a home computer.
For a brief summary of the Internet's growth, visit http://www.nw.com/zone/host-count-history or http://www.weyrich.com/web_business/www_history.html
For surveys of who is using the web and how, visit http://www.gvu.gatech.edu/user_surveys/
The real breakthrough for the Internet was the invention of the World Wide Web at CERN, brought to a wide audience by the Mosaic hypertext browser developed at NCSA.
This coincided with affordable graphics-capable home computers, and almost overnight the Net went from a text-based nerd's paradise to the user-friendly point-and-click world we know today.
Did I say user friendly? Well almost... read on...
The Internet works by breaking up information into packets of data. Each packet of data is given an address and sent off on its merry way. When the packets are received at the other end, they are reassembled to give a faithful copy of the original data.
The Internet consists of a network of computers all passing messages to and fro. Each packet gets passed by a machine to its neighbours, which then decide to pass it on, or pass it back. This trial-and-error approach is both the Internet's greatest strength and its greatest weakness.
It's a strength because if an Internet node goes down (and this happens even without nuclear strikes), messages simply divert around the missing node. This may mean taking a detour via a satellite link over the Indian Ocean, or travelling by optical fibre via America. It really doesn't matter to you; the Internet sorts it all out.
It's a weakness because you usually need all your packets to reassemble the original message, and if one takes a detour this may delay the whole message. If one gets lost altogether, it must usually be re-sent before you can get the rest of the message.
You can see this in a browser sometimes when a link stalls for no good reason and then after a while carries on. Most likely one of your packets just took the tourist route.
In summary, the Internet is flexible, but may as a result not be all that fast.
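To make the packet mechanism concrete, here is a minimal Python sketch of the idea: a message is split into numbered packets, which can then be reassembled correctly even if they arrive in a different order. This is a toy illustration only, not how a real IP stack is implemented.

```python
import random

def split_into_packets(message: bytes, size: int = 4):
    """Split a message into (sequence number, data) packets of at most `size` bytes."""
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    """Sort packets back into sequence order and rebuild the original message."""
    return b"".join(data for _, data in sorted(packets))

packets = split_into_packets(b"Hello, Internet!")
random.shuffle(packets)       # packets may arrive in any order
print(reassemble(packets))    # b'Hello, Internet!'
```

The sequence numbers are what let the receiving end restore the original order; if any packet never turns up, the message cannot be completed until it is re-sent.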
Each Internet node is given its own Internet Protocol or IP address. This is a series of four numbers, such as 188.8.131.52, and is unique. Most Internet nodes also choose to make this number correspond to a domain name, that is, a text name that is more comprehensible to users. The translation between names and IP addresses is performed by Domain Name Servers, and failure to convert a name to an IP address gives the dreaded "DNS lookup" error. (The equally dreaded error 404 is different: it means the name was resolved, but the page you asked for wasn't found on the server.)
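As an illustration of the dotted-quad format, a few lines of Python can check whether a string looks like a valid IP address. This is a toy format check only, not a real resolver; actual name-to-address lookups go through the DNS.

```python
def is_valid_ip(address: str) -> bool:
    """True if the string is a dotted-quad IPv4 address: four numbers, each 0-255."""
    parts = address.split(".")
    return len(parts) == 4 and all(p.isdigit() and int(p) <= 255 for p in parts)

print(is_valid_ip("188.8.131.52"))   # True
print(is_valid_ip("www.yrl.co.uk"))  # False: a domain name, not an IP address
```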
Domain names form part of your email addresses, web addresses, ftp addresses etc.
In the case of email, the domain name is that part of the email address after the "@".
The part of your email address before the "@" depends on the administration of email on your Internet node.
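Splitting an email address into these two parts is a one-liner; the sketch below uses a made-up address purely for illustration.

```python
def split_email(address: str):
    """Split an email address at its "@" into the local part and the domain name."""
    local, _, domain = address.rpartition("@")
    return local, domain

# "someone@example.co.uk" is a hypothetical address used only as an example.
print(split_email("someone@example.co.uk"))  # ('someone', 'example.co.uk')
```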
The way it works is broadly speaking as follows. The IP address space (in the form nnn.nnn.nnn.nnn, all numbers) is carved up into smaller chunks, which are then administered by different organisations. This carving up is all done behind the scenes by some IP committee somewhere.
In the early days reserving names of well known companies used to be quite lucrative, as the named company would eventually have to buy the name off you. There is still some competition for "good names".
In order for 188.8.131.52 to be understood as www.yrl.co.uk, someone on the Internet backbone has to act as a domain name server, i.e. they know how to match a name with an IP address.
These organisations, often ISPs, can sub-let parts of domain space, and arrange the allocation within their own space.
Thus extensions like .ac.uk (uk academic establishments) are run by appropriate bodies. Each body is free to charge for this service, and in the commercial areas there is usually a fee in the £100/year region.
There may be further rules within each domain. For example an .ac.uk domain name won't be allowed unless you're a recognised university.
Domain names are similar to postal addresses, in that the last part of the address is most likely to be familiar to you. Each domain name reads roughly as follows
[machine name].[organisation name].[domain type]
Reading this backwards we have the domain type which is always present. The main domain types include
.com - commercial
.org - organisation
.edu - education
.net - network provider
.mil - military
Since the Internet evolved in America, these mostly apply to American (or multi-national) organisations. Other countries append a (usually) 2-letter country code, and most reproduce the above structure in some way. Thus in the UK we have
.co.uk - UK commercial
.ac.uk - UK academic
.org.uk - UK organisations
Other countries have slightly different organisations.
In front of the domain type is the organisation name. This must be present and unique within the domain type. Thus "microsoft.co.uk" and "microsoft.com" are both allowed (and both exist).
In front of the organisation name is the machine name. For Web access this is commonly just "www". Thus the web address of Microsoft is
www.microsoft.com
indicating that this is the web access for the American company Microsoft. Knowing this naming structure allows you, quite frequently, to guess correctly what the URL of a desired site might be.
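The [machine name].[organisation name].[domain type] pattern can be sketched as a simple parser. This is an illustration of the pattern only; real hostnames may have more (or fewer) labels than this.

```python
def parse_hostname(hostname: str):
    """Split a hostname into (machine, organisation, domain type),
    following the [machine name].[organisation name].[domain type] pattern."""
    machine, _, rest = hostname.partition(".")
    organisation, _, domain_type = rest.partition(".")
    return machine, organisation, domain_type

print(parse_hostname("www.microsoft.com"))    # ('www', 'microsoft', 'com')
print(parse_hostname("www.microsoft.co.uk"))  # ('www', 'microsoft', 'co.uk')
```

Note that country-coded domain types such as "co.uk" come out as a single unit, since only the first two dots mark the machine and organisation boundaries.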
- The ruling body for domain names is, or rather was, the International Ad Hoc Committee. (This body was dissolved on May 1, 1997 and now refers you to http://www.gtld-mou.org)
- A full list of existing domains can be found at http://www.itu.int/intreg/dns.html
- A survey of domains and how many hosts they have can be found at http://www.nw.com/zone/WWW/dist-byname.html; this analysis is repeated every 6 months.
- Plans are under way to introduce new domains e.g. .firm, .nom etc. See the discussion at http://www.netfact.com/iahc/
- There are some sites that will translate domain names into real company names, often as an aid to choosing your own domain name and making sure it doesn't conflict with an existing name. See http://www.namesnet.co.uk/ for example
Using the Internet as a communications network, an increasing number of protocols have evolved to allow different types of information to be distributed. Many of these services were originally text-based and unfriendly to use, and as such have largely been superseded by the Web, but we mention them here for completeness.
Email is one of the earliest, and still one of the most popular and enduring, uses made of the Net. Reflecting this, there are a large number of free and commercial email packages of increasing sophistication.
Email is almost enough justification in itself for seeking Internet access, especially since these days all sorts of information can be sent this way.
See the chapter on Email for a fuller discussion.
FTP is a means by which users can transfer files between computers. A typical use of FTP is to set up a file server onto which users can login and download software and other files.
Although increasingly integrated with Web browser software, there are still a number of very useful FTP sites that cannot be accessed by "anonymous login". See http://www.eff.org/papers/eegtti/eeg_137.html for details on how to use FTP outside a browser.
Used inside a browser, FTP appears just like a directory listing, and you can simply navigate up and down the directory tree.
Useful sites include RTFM, which lists all the FAQs by Usenet group; for example
contains the tiddlywinks FAQ (posted monthly).
Also sites like
can help you find the FTP site that holds the resource you're looking for.
Telnet allows you to log-in to another machine over the Internet.
For more details see http://www.eff.org/papers/eegtti/eeg_93.html
Usenet, or News, is a set (21,000 groups and rising) of public discussion groups. Messages are posted to a newsgroup and distributed to all who read that group.
These are an unrivalled means of communication with your peers and a rich source of information. They are discussed fully in the chapter on News and Usenet.
Internet Relay Chat allows you to "chat" interactively with one or more people. To do this you need IRC software installed, and you need to join a "discussion room". These rooms are arranged by subject, so in principle you should be able to find someone to talk to about something.
For more details see http://www.eff.org/papers/eegtti/eeg_230.html
The HyperText Transfer Protocol (HTTP) marks the arrival of the Web proper. It is discussed fully in the chapters on Browsers and Creating your own web pages.
Other services of less interest here include
Software is already starting to appear that will search the Net intelligently, sniffing out information that you want. It's difficult to see where this will lead, but an often-quoted example is that software could search all the news web sites to construct a daily "newspaper" containing only articles known to be of interest to you.
Search technologies are already here, but expect them to get smarter and smarter. For example, AltaVista has recently added something called Live Topics, which analyses and categorises your search results to help you fine-tune them further.
Cybermedia Oil Change is software that will analyse your PC, and search the Web for any software upgrades available, offering to download and apply any updates it finds.
The way software is purchased is changing, increasingly taking place through the Internet. New technologies such as Marimba Castanet and Java allow you to have software that automatically updates itself each time you connect to the Internet.
Windows '98 is likely to feature this technology in the guise of "channels" that you can "tune into" to receive regular updates of news and software etc.
Again, already here. You can now make International telephone calls from computer to computer via the Internet at a fraction of existing prices. Only bandwidth and protected interests are stopping this taking off now.
Security and privacy were never high on the list of objectives when the Internet was first designed. Although probably not as big a problem as newspaper headlines might suggest, you should be aware of the following points.
If a message is vital, make sure you get a reply.
If in doubt, don't do it. Use the Net to get a voice phone or fax number and use that instead.
If you want to check, try sending a mail to the person. If that fails, or if you get a "not me" reply, then it may have been faked.
This is a tricky one. The simplest solution is don't do it. If you must do it, only download from trusted sites, and use a virus checker. However, even this is no guarantee as software downloaded from Microsoft has been infected in the past.
If in doubt, disable these features in your browser.
This is the Net equivalent of junk mail, and is simply a fact of life.
Note, most of these risks are no different to those you run using paper mail, cordless phones, mail order catalogues, or software off a bulletin board.
The difference is that you will be doing this electronically, more frequently and more publicly than before.
Common sense will get you past most problems.
Access to the Internet is usually through either the academic or commercial organisation you work for, or from an Internet Service Provider (ISP) contacted from home.
At work you are likely to have permanent access through a high-bandwidth link.
At home you are likely to have sporadic dial-up access through a modem.
Modem access is usually at local telephone rates, which are free in some countries, but by no means all. Additionally, modem access can be relatively slow, making it proportionally more expensive.
This usually means that private users are less enamoured of high graphics content on web pages, are less inclined to download large software programs, and won't / can't afford to touch video content with a bargepole.
Many web page designers forget this simple fact.
The response time on the Internet varies according to a large number of factors, most of which are simply the consequences of its success and phenomenal growth rate.
This behaviour usually runs in cycles: service gets so bad that people leave, and new resources are commissioned. This used to happen with the whole of the UK's access to the US, but that link is more consistent now.
Commerce on the Internet has not fully developed yet. Whilst few companies or organisations these days have no presence on the Internet, it's still early days for commerce via the Net, though the predictions are quite astounding.
What is true is that searching for items via the net and advertising via the net are certainly here to stay. What is missing is a reliable, secure and international method of paying for services.
However, fear not, for it's on its way. For example, see Digital's MilliCent, a proposal to allow micropayments to be charged each time you click on a particular service on a web page.
The days of free Internet services such as electronic newspapers may be limited.
So successful has the Internet been, that it's spawned a child - the Intranet. Intranet is the term used to describe the adoption of Internet technologies such as IP networks, email and browsers for the internal networking needs of an organisation.
One sign of this process is the increased use of HTML to produce Web-like documentation. The distinction between on-line and off-line software and services begins to blur when documents reference on-line source material, and are even capable of being updated automatically via the Internet when a connection is available.
This adoption of Internet technology is attractive because the software is cheap or free (due to its mass appeal), and familiar (also due to its mass appeal).
Equally the browser software that has made the Internet so popular with users is felt by many to be more intuitive than many traditional software interfaces.
This last point has particularly been taken on board by Microsoft, who are making their Internet Explorer browser an ever larger and more central part of their future operating systems.
© 1997-1999 John A Fotheringham
Last Minor Update : 4 December '99