As with many recent innovations, the genesis of the Internet lies in U.S. technology developed during the Cold War. In the days when the U.S. and U.S.S.R. were racing to the moon, the Department of Defense created the Advanced Research Projects Agency (ARPA) to spearhead cutting-edge military research. Scientists at ARPA wanted to find a way to connect computers located hundreds or thousands of miles apart. In 1969 the ARPANET was born, linking four computers at universities in California and Utah.
The ARPANET grew slowly during the 1970s, connecting computers at universities, research labs, and government agencies. By 1981 more than 200 computers were linked to the network, and by then it was no longer limited to military projects: the Defense Department and the National Science Foundation had opened access to the broader scientific and academic community.
Meanwhile, several corporations, universities, and agencies in the U.S. and Europe began building their own computer networks. It soon became clear that these networks needed to exchange information with one another. Building on ARPANET technology, scientists developed standards for an Internet – an interconnected network of networks.
Then along came the personal computer. In the 1980s and ’90s, millions of PCs appeared in homes and offices. In 1989 British computer scientist Tim Berners-Lee led the development of a system that allowed people to navigate the Internet using “pages” of text and images on a computer screen – a creation he dubbed the World Wide Web. Corporations got busy connecting computers around the globe. By the dawn of the twenty-first century, an estimated 360 million people had access to the Internet. By 2009, that number had grown to more than 1.7 billion.