
The Internet - A Case Study

Introduction

Rod Munday

The two media case study lectures conclude this week with an account of the development of the internet. Unlike many other inventions, it is hard to conceive of the internet as the work of one person, or even of a group of individuals; rather, it is the result of the coming together of many different kinds of technologies. The internet is part computer hardware, part communications network and part software. However, we must remember that these technologies were not invented specifically with the internet in mind. All of them emerged out of quite separate research paths, with their own goals and motivations. In this sense the story of the creation of the internet can be regarded not so much as a linear narrative but as a genealogy, visualised as a kind of family tree consisting of generations of different discoveries. Or, more accurately perhaps, the internet genealogy can be represented as a series of cuttings from different family trees, all growing together in the same pot.

In order to make sense of this complex metaphor, we will be looking at three branches of the internet family tree: firstly, the development of the computer; secondly, the development of the network that would eventually become the internet; and thirdly, the personal computer revolution which facilitated the internet's widespread adoption in the 1990s.

1. The Computer Family Tree

Leibniz and Binary

A crucial invention for computing emerged three hundred years before the first electronic computer: the binary system of mathematics. The binary system, or base two, is a method of calculation readily adaptable to electrical circuitry, because binary numbers consist only of ones and zeros, which can readily be made to correspond to the "on" and "off" states of an electrical switch. The binary system is thus the mathematical foundation on which all of today's digital technology is built.
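To make the correspondence concrete, here is a minimal sketch in Python (the number chosen is arbitrary) showing a decimal number rendered as binary digits, and the same digits read as a row of switch settings:

```python
# A decimal number as binary digits, and those digits as switch states.
n = 13
bits = bin(n)[2:]                                       # '1101'
switches = ["on" if b == "1" else "off" for b in bits]
print(bits, switches)                                   # 1101 ['on', 'on', 'off', 'on']
```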

Binary was actually discovered in India in the third century BC (Wikipedia 2006a), but it became known in the West when it was rediscovered in the seventeenth century by Gottfried Leibniz, a German philosopher and mathematician working under the patronage of the Hanover royal family (Borgmann 1999, 141).

fig. 2, Leibniz

As a philosopher, Leibniz was famous for formulating the doctrine of 'the best of all possible worlds', for which he was mercilessly caricatured by Voltaire in the novel Candide as the hopelessly optimistic Dr. Pangloss (Russell 1991, 563). As a mathematician, apart from rediscovering binary, he is also credited with inventing calculus independently of Sir Isaac Newton. The devout and God-fearing Leibniz regarded the binary system as the mathematical expression of God's first act of creation: "let there be light, and there was light." Leibniz thought creation itself was born out of a binary equation, reasoning that out of divine unity (one) and formless nothing (zero) everything else could be generated (Borgmann 1999, 141).

Babbage and the First Computer

In 1822, Charles Babbage sketched out his designs for the first computer, the 'Difference Engine No. 1', a machine he later unsuccessfully attempted to build. Affected by this failure, but nevertheless resolutely determined to try again, Babbage set about designing a second machine, an endeavour that would consume the rest of his life. This machine, the 'Difference Engine No. 2', was never built either. But Babbage's failure has today become a source of tantalizing speculation that human history would have turned out very differently if he had succeeded. For example, his story inspired The Difference Engine (1990), the novel by William Gibson and Bruce Sterling that helped establish the 'steampunk' genre of science fiction (fig. 3).

fig. 3, The first 'steampunk' novel

However, such romantic flights of fancy were brought back down to earth when, in the early 1990s, the London Science Museum commissioned the construction of what turned out to be a working model of Babbage's Difference Engine No. 2.


fig. 4, Difference Engine No. 2

While the design worked beautifully, the machine turned out to be little more than a very slow mechanical calculator. What this shows, apart from exposing certain excesses of cultural mythologizing, is that in order to process significant amounts of information a computer has to operate at electronic speeds. This means that its switching has to be electronic rather than mechanical.

The first electronic computer

The prototypes for the first electronic computer started to appear around the time of the Second World War.

fig. 5, Colossus at Bletchley Park, UK

Most famously, there was Colossus, a valve-driven code-breaking machine built in 1943 by a team based at Bletchley Park in the UK to crack high-level German ciphers. (Colossus is popularly associated with the 'Enigma' code, though it was in fact used against the Lorenz cipher; Enigma was attacked with earlier electromechanical machines.) Also at Bletchley Park was the brilliant mathematician Alan Turing, who dreamt of a "universal machine" which would run on Leibniz's binary code and be capable of performing any task it was programmed for. This dream was to inspire later generations of computer scientists.

fig. 6, ENIAC, the first electronic computer

The first general-purpose electronic computer was built three years later, in 1946, in the US at the University of Pennsylvania, by John Presper Eckert and John Mauchly, with Herman Goldstine among their collaborators. Known as ENIAC (Electronic Numerical Integrator and Computer), it was a valve-driven behemoth, so massive that it filled an entire gymnasium. ENIAC also drew so much electrical power that, on the night it was first switched on, electric lights all over the city of Philadelphia were said to have blinked.

fig. 7, valves versus transistors, a size comparison

In 1948, the shift to computer miniaturisation began with the invention of the transistor by William Shockley, John Bardeen and Walter Brattain, who later won the Nobel Prize for their efforts. The transistor did away with the glass vacuum tube that housed the valve, and thus shrank the components of a computer to a fraction of their former size. Later still, transistors would be etched onto wafers of silicon, and the integrated circuit would replace individual transistors.

By the early 1960s, computers had grown smaller, filling a room rather than a gymnasium, and they were faster and more capable too. Then, in 1965, the future pace of miniaturisation was predicted by Gordon Moore, who went on to co-found Intel. "Moore's Law" stated that the number of transistors that could be placed on a single circuit would double roughly every eighteen months, a prediction which has, more or less, held true ever since.
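As a quick worked example of what an eighteen-month doubling rule implies, here is a sketch in Python; the starting figure (roughly the Intel 4004 of 1971) is illustrative only, and real chips have doubled closer to every two years:

```python
# Project the eighteen-month doubling rule forward from 1971.
transistors, year = 2_300, 1971.0   # ~Intel 4004 transistor count, for scale
while year < 2001:
    year += 1.5                     # one doubling period
    transistors *= 2
print(f"by {year:.0f}: about {transistors:,} transistors per chip")
```

Twenty doublings in thirty years multiply the count by about a million, which is why the prediction mattered.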

2. The Internet Family Tree

Who invented the internet?

The most often cited answer to the question "who invented the internet?" is that it was an initiative of the US Defense Advanced Research Projects Agency, or DARPA for short. In 1957, at the height of the Cold War, the Soviet Union launched the space satellite Sputnik (fig. 8).

fig. 8, Sputnik


Visible from earth, and shining like a new star in the night sky, Sputnik created a profound sense of anxiety in the minds of the American people. It seemed to them to be proof of the Russian enemy's vast technological superiority, even though the satellite was actually little more than a radio transmitter, capable only of broadcasting a repetitive beep back to Earth. As a result of this spectacular Soviet propaganda coup, many bold initiatives were given the green light in the US. One of these was the Apollo missions to the moon; another was the development of the system that would eventually become the internet.

The protean internet

At first, the internet was just an idea for a single computer network. The germ of the idea was first seeded by US military generals in the form of a question: was it possible to build a communications system that could survive a nuclear war? (Back in the early 1960s, it seemed prudent to plan for such a terrifying contingency.)

fig. 9, Paul Baran

In 1964, Paul Baran (fig. 9), a researcher at a Cold War think tank called the RAND Corporation, thought he had come up with an answer. Baran reasoned that, in order to have the best chance of surviving a nuclear war, a communication system would have to be designed as a matrix of interconnecting nodes, and each node would have to be autonomous; in other words, capable of sending and receiving messages on its own, without taking instructions from elsewhere. The nodes therefore had to be computers because, in the event of a full-scale nuclear exchange, computers were the only devices which could process the vast number of complex instructions necessary to keep the network running.


To understand how Baran's idea works, imagine a series of dots connected by a lattice of lines, as in fig. 10.

fig. 10, a distributed network of nodes

Each dot, or node, on this lattice represents a computer. The lines between the dots represent the communication lines linking the computers together. You can see from fig. 10 that there are a lot of potential ways for a message to get from one nodal point to another. Compared to the centralised phone system, this is actually a very inefficient and expensive way of sending a message, but the redundancy in the system was also its strength, because it made the network very robust. Messages could not only be routed via any nodal point, they could also be copied and re-sent by any nodal point. Thus the probability of a message getting through was much higher, rising with the number of alternative routes through the system. Much later, this robustness would create problems for internet censorship because, in the words of a well-known hacker aphorism, "the internet sees censorship as damage and routes around it."
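The "routing around damage" can be shown in a few lines. Here is a minimal sketch in Python (the four-node mesh and its link table are invented for illustration): a breadth-first search finds a path between two nodes, and still finds one when an intermediate node is knocked out.

```python
from collections import deque

# A toy mesh: each node lists the nodes it is wired to.
links = {"A": {"B", "C"}, "B": {"A", "C", "D"},
         "C": {"A", "B", "D"}, "D": {"B", "C"}}

def route(src, dst, dead=frozenset()):
    """Breadth-first search for a path, ignoring destroyed nodes."""
    seen, queue = {src}, deque([[src]])
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links[path[-1]] - dead - seen:
            seen.add(nxt)
            queue.append(path + [nxt])

print(route("A", "D"))              # ['A', 'B', 'D']
print(route("A", "D", dead={"B"}))  # ['A', 'C', 'D'] -- routes around damage
```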

As an additional safeguard, the messages sent and received by this system were also broken up into packets of information, each packet being separately addressed and numbered. For example, if a message was broken into ten packets, each packet would be labelled "one of ten," "two of ten," "three of ten" and so on. In this way, the message as a whole was much less susceptible to loss or damage, because the receiving computer could tell that, say, packets "four" and "five" of the ten had not arrived, and could request that they be sent again.
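A minimal sketch in Python of this numbering-and-resend scheme (the message text and the "lost" packets are invented for the example):

```python
def to_packets(message: str, n: int = 10):
    """Split a message into n numbered packets."""
    size = -(-len(message) // n)          # ceiling division
    return {i + 1: message[i * size:(i + 1) * size] for i in range(n)}

packets = to_packets("This message travels the network in ten numbered pieces.")
received = {k: v for k, v in packets.items() if k not in (4, 5)}   # 4 and 5 lost

missing = sorted(set(packets) - set(received))
print("request resend of:", missing)      # [4, 5]

received.update({k: packets[k] for k in missing})                  # resent
print("".join(received[i] for i in sorted(received)))              # reassembled
```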


ARPANET

In 1969, five years after the initial idea was proposed, the first network came 'online'. The reason for the delay was that it took that long to design the packet-switching protocols that allowed the computers to 'talk' to one another. The system was known as ARPANET, as a tribute to its military sponsor (then called ARPA, later DARPA). At first, ARPANET consisted of just four computer nodes (fig. 11).

fig. 11, ARPANET in 1969 - four nodes

The data capacity of the entire system was also incredibly small by today's standards: its links carried only 56,000 bits per second, the equivalent of the data transfer of a single 56k modem. As a consequence, only text messages were sent at first.


fig. 12, ARPANET in 1971 - fifteen nodes

During the 1970s, ARPANET grew exponentially and, as it grew, an interesting social phenomenon began to emerge. The military scientists using ARPANET started to post non-military communiqués and even gossip on the network. Also, because each user had their own personal account, the same message could be addressed to multiple recipients, which was how mailing lists got started. Gossiping on the network was of course frowned upon by the military authorities, since in the 1970s computers were so expensive that computer time was a precious resource, but it continued nevertheless.

fig. 13, ARPANET 1980 — 70 network nodes, 100s of computers

By the early 1980s, ARPANET users numbered in the thousands. In addition, it was also being used by non-military scientists and other academic institutions, including the National Science Foundation, NASA and the Department of Energy. Because of this diversification, in 1984 the network was broken up into six basic domains, each given a specific address suffix. Military communications would now have their own dedicated network, MILNET, identified by the suffix ".mil", while non-military communications would be divided up under several headings and given their own unique suffixes. Foreign countries chose to be denoted by their geographical locations, for instance ".uk" for Britain. Educational organisations were given ".edu", commercial operations ".com", and non-governmental organisations and other non-profit concerns ".org". Finally, ".net" was reserved for other network gateways (Sterling 1993).
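A minimal sketch in Python of how such suffixes sort addresses into categories (the lookup table paraphrases the list above; the hostnames are invented):

```python
# The 1984-era top-level domains and what they denoted.
DOMAINS = {"mil": "military", "edu": "educational", "com": "commercial",
           "org": "non-profit", "net": "network gateway", "uk": "Britain"}

def classify(hostname: str) -> str:
    """Read off the suffix after the final dot."""
    return DOMAINS.get(hostname.rsplit(".", 1)[-1], "unknown")

print(classify("arpa.mil"))      # military
print(classify("aber.ac.uk"))    # Britain
print(classify("example.com"))   # commercial
```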

The breaking apart of ARPANET meant that the system was no longer a single network but rather a collection of interconnecting networks. This was made possible by a new, improved set of switching protocols developed under DARPA sponsorship and known as TCP/IP (Transmission Control Protocol/Internet Protocol). Using TCP/IP, many different computer networks could be joined together into what became known as the 'network of networks', which was also called the ARPA-INTERNET, or the INTERNET for short.
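TCP still does this job today, and Python's standard library exposes it directly. A minimal sketch (the address and port number are arbitrary): a server and a client on one machine exchange a message over a TCP connection.

```python
import socket, threading

# Server: bind and listen before the client tries to connect.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 9999))
srv.listen(1)

def echo_once():
    conn, _ = srv.accept()
    conn.sendall(b"echo: " + conn.recv(1024))   # TCP delivers bytes in order
    conn.close()

threading.Thread(target=echo_once).start()

# Client: open a connection, send, receive.
cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(("127.0.0.1", 9999))
cli.sendall(b"hello, network of networks")
print(cli.recv(1024))                           # b'echo: hello, network of networks'
cli.close()
srv.close()
```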

3. The Home Computing Family Tree

Counter Culture Computing

As Manuel Castells observes, the first phase of the internet was a unique blend of military strategy, big science and counter-culture innovation (Castells 1996, 351). The third of these factors found expression in the revolutionary rise of the personal computer in the 1970s and 80s. It was, in Castells' words, a technological blossoming of the [hippie] culture of freedom, individual innovation and entrepreneurialism (ibid., 5).


fig. 14, the Altair 8800

In 1975, MITS, a company based in Albuquerque, New Mexico, built the first personal computer out of a microprocessor that had been designed for use in automated traffic lights. The computer was called the Altair, after a planet featured in an episode of Star Trek. This development inspired two Harvard drop-outs, Bill Gates and Paul Allen, to form a company to write software for the Altair. The company was called Microsoft, and it too was based in Albuquerque at the time.

Meanwhile, in Palo Alto, California, a group of computer enthusiasts had been gathering for regular meetings of the 'Homebrew Computer Club' to show off their home-built computers. Two of the members of this club were Steve Jobs and Steve Wozniak. Jobs decided to form a company to manufacture the home computer his friend Wozniak had built. The company, called Apple, was launched in 1976 with $91,000 in capital; by 1982 its turnover had reached $583 million.


fig. 15, the Apple II personal computer

The phenomenal success of the Apple personal computer took IBM (then the major player in the computer market) completely by surprise. Its own version of the PC had to be quickly assembled from non-proprietary parts, using a third-party operating system licensed from Microsoft. Interestingly, a decade and a half later, Microsoft would similarly fail to see the potential of the internet and, faced with the phenomenal success of Netscape Navigator (co-created by Marc Andreessen in 1994), was forced to create its own Internet Explorer by hastily reworking third-party software sourced from an outside company called Spyglass (Castells 2001, 176).

Software in the Public Domain

Bell Labs, part of the US telecommunications giant AT&T, invented the UNIX operating system, which became the software that ran many of the computer nodes, by now called servers. In the 1970s, AT&T enjoyed a government-sanctioned monopoly on telephone communications within the US, and because of the unfair advantage this gave it over other companies, AT&T was required to release its software for little more than the cost of distribution. This meant that UNIX was available essentially as freeware. The transmission protocols of TCP/IP were likewise openly published and free for anyone to implement. Both of these systems later became the backbone of the internet. TCP/IP was regarded as such a robust protocol suite that the joke went that it could even connect two tin cans and a piece of string; successful experiments were later run with it using homing pigeons as the message carriers (Wikipedia 2006b). The robustness and simplicity of these systems, combined with their cheapness, meant that it was relatively easy for any organisation that used computers to go online in the 1980s.

The World Wide Web

fig. 16, surfing the Net

However, the uptake of the internet could not be described as phenomenal until the mid-1990s. Asa Briggs and Peter Burke refer to a book, Technology 2001: The Future of Computing and Communications, published in 1991, that made no mention of the internet (Briggs and Burke 2005, 244). One of the first internet applications to attract wider publicity was the World Wide Web (WWW), developed in 1989 by a British researcher, Tim Berners-Lee, working at the CERN particle research facility in Switzerland. The 'Web', along with email, was perceived as one of the first "killer apps" of the internet, especially after the Mosaic browser was released as freeware in 1993. Mosaic was the first browser to run on the Windows operating system rather than on UNIX. This development, and the launch of Netscape Navigator soon afterwards, greatly simplified the activity of web browsing.

Often, the World Wide Web and the internet are taken to mean the same thing. This is not the case: the internet is a network consisting of other networks of computers, while the World Wide Web is a means of accessing information over the internet. Berners-Lee, who formed the World Wide Web Consortium in 1994, made his idea freely available, with no patent and no royalties. The World Wide Web Consortium later championed, as a foundational principle, the idea that the internet should be based on non-proprietary technology. A utopian notion perhaps, but one that has proved surprisingly robust, with success stories like Google proving that a company does not have to charge users to make a profit online.

The Web uses a system called hypertext, which allows documents to be linked together by a series of hyperlinks.
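Mechanically, a hyperlink is just a machine-readable cross-reference embedded in a document, which software can extract and follow. A minimal sketch in Python (the sample page is invented; the parser is Python's standard html.parser):

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Print the target of every <a href="..."> in a page."""
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    print(value)

page = ('<p>See <a href="http://info.cern.ch">the first website</a> '
        'and <a href="history.html">a local page</a>.</p>')
LinkFinder().feed(page)   # prints the two link targets
```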

Hypertext was invented by Ted Nelson in the early 1960s. Back then, Nelson harboured his own ideas of a global computer network, Project Xanadu, which was later somewhat overshadowed by the success of the World Wide Web, a situation that Nelson remains bitter about to this day. Nelson, for his part, was inspired by Memex, an ambitious idea for storing all the world's knowledge on microfilm, proposed in 1945 by Vannevar Bush, the head of the US Office of Scientific Research and Development and a personal advisor to President Roosevelt. These examples indicate that the dream of building a repository of all the world's knowledge is not new. In fact the great library at Alexandria, constructed around 300 BC, can also be conceived of as such a repository.

Conclusion

fig. 17, The internet: a force for good or evil?

The rise of the internet in the 1990s has inspired many superlative descriptions. It has not only been considered the most important medium of the twentieth century (Briggs and Burke 2005, 244), but has also been called the most important discovery since writing, and an application that will usher in a new age: the information age (Castells 1996, 328). Hopefully, from reading this account, you will be more aware of the social processes responsible for the internet's creation. Long before the internet existed, in fact two thousand seven hundred years ago, human beings discovered a way to preserve their thoughts externally, in the form of writing. This was the true start of the information age. Even before then, however, communication was something which defined our species, and it is therefore not surprising that it is something we have always striven to achieve, whether in the form of posting personal messages on ARPANET or inventing smoke signals.

Regarding technological determinism, Castells argues that, in a sense, it is a false dilemma, since "technology is society and society cannot be understood or represented without its technological tools" (Castells 1996, 5). Raymond Williams echoes these sentiments, reasoning that the debate between technological determinists and social determinists is essentially a sterile one, because "each side has abstracted technology away from society" (Williams 2003, 6).

The rise of the internet has inspired many people to speculate about profound social consequences. Not all of these speculations have been positive, but the majority have been misinformed. Generally speaking, a technologically determinist argument can be characterised by the fact that it ignores history, culture and the social context in which technologies operate. Technology is cast as a slave driver, beating a drum to which all human beings are compelled to dance, whether they desire to or not. Those who spread fears about new technology generally do so in ignorance of the fact that fears over new technology are themselves nothing new. The problem with their arguments is that they downplay the importance of human desire. For while many technologies have been credited with creating desire, no technology has yet been invented that is capable of removing it. As Freud pointed out, desire has no natural object (Appignanesi 1992, 72), and therefore it is always going to be a law unto itself. It is for this reason that word processors will never replace pen and ink as long as people desire to write letters by hand, and the internet will not replace books as long as people desire to read them. While it is conceded that the majority of people no longer desire to write with quills or carve markings into the sides of clay pots, there is no technological prohibition capable of preventing a person from doing these things. Indeed, if I desire to write with a quill, all I need to do is find a feather.

References

Appignanesi, Richard and Oscar Zarate (1992), Freud for Beginners, London: Icon Books.

Borgmann, Albert (1999), Holding On To Reality, London: University of Chicago Press.

Briggs, Asa and Peter Burke (2005), A Social History of the Media, Oxford: Polity.

Castells, Manuel (1996), The Rise of the Network Society, London: Blackwell Publishers.

Castells, Manuel (2001), "Epilogue," in Pekka Himanen, The Hacker Ethic: The Spirit of the Information Age, London: Vintage Publishers.

Russell, Bertrand (1991), History of Western Philosophy, London: Routledge.

Sterling, Bruce (1993), "A Short History of the Internet," URL http://w3.ag.uiuc.edu/AIM/scale/nethistory.html

Wikipedia (2006a), "Binary numeral system," URL http://en.wikipedia.org/wiki/Binary_numeral_system

Wikipedia (2006b), "Internet protocol suite," URL http://en.wikipedia.org/wiki/Tcp/ip [Accessed 10/10/06]

Williams, Raymond (2003), Television: Technology and Cultural Form, London: Routledge.

Additional Sources

Good images of ARPANET, and other historical network maps, can be found in "An Atlas of Cyberspace," URL http://www.cybergeography.org/atlas/historical.html
