If you’re reading this, you shouldn’t need an introduction to the Internet. Just to get to this page, you’ve already mastered the basics of using a browser to access information on this amazingly powerful medium.
The Web literally puts the world on your computer. You can now access media produced not only by huge corporations but also by ordinary people much like yourself. Indeed, you may already be a “web publisher” without thinking about it in those terms (we’ll take a closer look at social media in the next chapter).
Thus I shouldn’t have to work too hard to sell you on the idea that the Web is worthy of your attention. So we’ll focus instead on understanding why the Internet works the way it does and what you can do with it.
Anyone who’s been to law school will tell you that often the question “can you sue?” is nowhere near as important as “whom can you sue?” The Internet provides perfect examples.
Say a college student has a blog on Blogger. He gets mad at one of his professors, so he gets on his blog and falsely accuses her of stealing money from the college. This defamation could give the professor the right to sue the student for libel. But being an average college student, the guy doesn’t have much money, which makes it hard to interest a lawyer in taking the case.
But what if the professor could sue not just the student but also the company that hosts his blog? Blogger is owned by Google, a multi-billion-dollar company with deep pockets that any lawyer in America would love to stick a hand into.
Prior to 1996, the question was very much up in the air. But the Communications Decency Act contained a provision – section 230 – that protected service providers from lawsuits based on content they didn’t create themselves. When the Supreme Court declared the bulk of the CDA unconstitutional, it left section 230 in place.
So now companies that merely host content created by users can’t be successfully sued if that content defames or otherwise injures someone.
“Protecting the children”
By the mid-1990s Internet usage had gotten big enough that even Congress noticed it. The web was a new way to communicate, so the government’s authority – or lack of same – to regulate its content was untested. So in 1996 the House and Senate passed the Communications Decency Act, and President Bill Clinton signed it into law.
Like many “protect the children” measures, the law went well beyond anything designed to safeguard kids. The regulation made it a crime to use the Internet to publish obscene or indecent material “in a manner available to a person under 18 years of age.” As underage users could of course access anything anyone else could see on the web, the law effectively banned indecency on the Internet, a restriction that could have included everything from soft-core porn to harsh language.
Opponents of the law immediately sued the government, asserting that it violated the First Amendment’s protection for free speech. The Supreme Court agreed, striking down the law as a violation of the Constitution.
Congress went back to the drawing board, and in 1998 they passed the Child Online Protection Act. Though the new law was limited to “commercial distributors of material harmful to minors,” the definition was again so broad that it would have significantly restricted speech on the web. Though it bounced around the courts for a while longer than the CDA, eventually it too was declared unconstitutional.
For round three (called the Children’s Internet Protection Act), Congress required schools and libraries receiving funding assistance from the federal E-Rate program to install filters on their computers blocking sites with indecent content. This restriction wasn’t a ban on the Internet as a whole. Rather, it was a condition placed on federal dollars; if libraries didn’t want to install filters, they didn’t have to take the government’s money.
This time the Supreme Court found the law constitutional, provided that adult users could have the filters turned off.
Fanning the flames
Take a medium that has the potential to reach millions of people, make it interactive so users can send messages as well as receive them, and eventually you’re going to unearth the dreaded substratum of jerks. If you’ve spent any time online, you’ve probably already run across at least a few people who seem to have nothing better to do than sit at their computers day and night dishing out insults about everybody and everything.
Such users are commonly known by insulting terms such as “trolls” and “flamers.” Though one expects extremists to crop up in discussions of controversial topics such as abortion, “flame wars” can also erupt over such seemingly innocent topics as tipping in restaurants or eating a vegan diet.
Naturally flamers are a real nuisance in social media, so we’ll encounter them again in the next chapter. But they creep in where they aren’t encouraged as well. “Comments” sections are particularly vulnerable. In general inviting reader feedback is a valuable tool that helps make web communication better. But when a flamer steps in, he can ruin the experience for everyone. Several prominent news outlets – including Reuters and The Week – have removed comments sections to help control flame wars.
Psychologists have more than one explanation for what motivates such people. Unfortunately, they can be hard to deal with. Unless they make threats or otherwise violate the law (such as regulations against cyberstalking), they’re pretty much free to lurk the net looking for opportunities. Unless the site’s webmaster deletes what they write, they just have to be endured.
The 8sails story
In connection with our consideration of making money online, let me suggest a radical concept: you don’t necessarily need to make money.
Just about every other mass medium requires a lot of money to get started. Radio and TV stations are priced well into the millions. The cheapest of movie productions still costs more than a good used car. Even the cheapest of magazines still requires a printing budget, not to mention a great deal of time and effort.
The web, on the other hand, costs little or nothing. With just a cheap computer and an Internet connection, you can set up a blog or even a whole web site. Your presence will be just as accessible to millions of potential readers as every other spot on the web.
So why worry about turning a profit? Money’s nice, no doubt about it. But it also comes with obligations. If you start trying to cater to audiences in order to increase your page views and up your ad revenue, you run the risk of generating what you think people will want rather than what you want to do. Sure, in the real world you have to bow to pressure in order to pay the bills. But if you’ve got a job that pays the bills anyway, online media work is cheap and easy enough to do on the side for the sheer pleasure of doing it.
Consider 8sails.com. Maintaining the site is cheap enough (around $100 per year) that I don’t have to concern myself with whether or not it makes any money. And the blogs I write for are even cheaper (i.e. completely free) to set up and maintain. I suppose I could “monetize” the site – put ads on all the pages – but then I’d have to devote a lot of effort to marketing and drawing in more readers. As it is, I can spend my time writing movie reviews, creating blog entries and working on special projects such as the Survival Guide.
Besides, as a reader would you prefer the site as it is or would you rather have each page “decorated” with ads for weight loss pills and dating sites?
From rags to riches
The World Wide Web also presents several opportunities for you to be your own boss. We’ve already discussed the importance of income from advertising, which is easy to get going but won’t provide quit-your-day-job money unless and until your site builds a significant following.
Don’t forget that even if you don’t want to start your own web site, you can freelance for other people.
The “sell something” option might seem daunting at first, but give it a second look. Run “eBay success stories” through a search engine and you’ll get no end of tales of people who’ve used the popular web “auction” service to turn piles of trash into stacks of money. Many artists also market their creations – just about anything from ceramics to T-shirts – on stand-alone sites or via web marketplaces such as etsy.com.
Some folks even make money on the web the way “flippers” profit in the real estate market. They buy potentially popular domain names from owners who are doing a bad job (or no job at all) of maintaining them, fix them up (improve the layout, add features and so on), and then sell them for a profit to companies that could put them to good use.
Getting a job without going to engineering school
In the beginning, web design was more a matter of programming than of art. If you couldn’t code fluently in HTML, you couldn’t get the web to do much of anything. Now of course the job market is a lot more open.
I’m not going to devote a lot of space to the technical end (low-level programmers, hardware experts and the like). Do a search on “Internet jobs” and you’ll get tons of advice about how to start careers, not to mention a ton of job hunt sites clamoring to help you find employment once you’ve got the necessary training.
Instead, let’s take a look at the web-based careers that more closely resemble traditional media jobs. If you’re doing the kind of work that other media do, you can expect the job requirements to be pretty much the same. So if you go to work for a magazine’s web site (or an unaffiliated web site that does the kind of stories magazines typically tackle), you’ll need to have the same set of skills and talents as any other magazine writer, photographer, artist or editor.
In addition, media web sites need web designers with media skills. For this kind of job you’ll need to be able to create pages, which will require you to be able to use web design software, probably do a little HTML coding as well, and work with whatever CMS your employer uses. You’ll also increase your value in the job market by sharpening your skills with multiple media. If you can get web text, video on demand and podcasts to all work together to tell a story, employers should love you to pieces.
Nobody owns the Internet. You’ve probably heard that a thousand times (including at least once or twice already in this chapter). But that doesn’t necessarily mean that nobody controls significant chunks of it.
Some countries place strict limitations on their citizens’ web usage and punish violators severely. In the United States the government doesn’t have the authority to restrict web content other than use of the Internet to commit crimes such as trafficking in child porn. However, that leaves a serious issue unresolved: if companies that own chunks of the Internet start imposing their own content restrictions, will the government do anything to stop them?
On the surface this might seem like a question that doesn’t bear asking. Why would an ISP bother restricting its customers’ access to the Internet? If the company that sold you access suddenly slowed down your connection or cut you off from your favorite sites, wouldn’t you fire it immediately and switch to a different service? Further, it seems like more trouble than it could possibly be worth. Why would an ISP care what sites you used or didn’t use?
In the real world, however, access is a big money issue. At home a friend of mine is using AT&T as an ISP, and – among other things – he uses his home Internet connection to watch movies on Netflix. From the phone company’s perspective, he presents a couple of problems. First, when he streams movies to his computer or television, he’s using a lot more “bandwidth” (tying up more of the company’s resources by downloading a lot more data) than if he just used his connection for email and course work. Second, if he gets all his video entertainment from Netflix, he isn’t likely to pay extra for AT&T’s UVerse television package.
So if the phone company decides that he has to pay extra to use enough bandwidth to access Netflix, he can just switch to a different ISP, right? But to whom can he switch? Time Warner provides Internet service in his neighborhood, but they’re in the same boat as AT&T, with a cable package they want to sell and lines they don’t want him to “over-use.” Sure, there are other ISPs out there as well, but how long before they develop the same control issues or get bought up by a bigger company with the media conglomerate attitude about net usage?
The FCC used to maintain an “open internet” policy that prohibited ISPs from charging extra for access to high-bandwidth sites. For example, neither Netflix nor its subscribers could be charged more for heavy usage. However, in December 2017 the commission voted to abandon net neutrality protections, creating many currently unanswered questions about the Internet’s long-term prospects.
If paywalls don’t work, what does?
People won’t pay directly for web content. Like most truisms, this one has a few exceptions. But for the most part people are used to being able to access everything on the web free of charge. Companies that try to set up “paywalls” (i.e. start charging for access to their material) rarely meet with a lot of success.
So if you can’t get people to pay for what you publish on the web, how can you make money with this massive medium?
The most obvious way is to earn money the old fashioned way: sell something. If you can’t get people to pay for your web content itself, then use the web to promote sales of something else. Many large corporations regard the web not as a profit center on its own but as a way to get people to buy whatever they manufacture, sort of like a big, elaborate advertising system.
And speaking of ads, that’s the way to make a more traditional media approach pay off. Readers may not be willing to give you their money directly, but if you have enough people visiting your site and looking at your pages, companies may well take an interest in running ads on your site. Services such as Google’s AdSense make this source of potential revenue easy to implement even on basic web sites. All web designers have to do is set aside short, wide spots for banner ads or thin, tall spaces for skyscrapers.
From thought to Web
There are three ways to put a web page together.
First, you can do it from scratch by writing the HTML code yourself. For simple (and I do mean simple) pages, that isn’t much of a problem.
For more complex pages, web design software comes in handy. It gives the designer a what-you-see-is-what-you-get look at how the page will appear in browsers, making it easier to incorporate links, graphics, animations and other more complex elements. Such software also includes functions that automatically upload files and keep things organized on the server.
For some sites, however, direct design by humans isn’t practical. Many blog sites cater to people who just want to post their thoughts (and maybe a photo or two) rather than getting caught up in the complexities of how the pages are actually put together. And big Internet retailers such as Amazon would have to employ a legion of designers if they created a human-made page for every product they sold.
Thus such operations tend to use Content Management Systems. These are basically programs that design pages automatically. Supply them with some basic information (what text is supposed to go on the page, how much a product costs and so on), and they do the rest of the work. CMS software creates pages, makes sure that they have all the proper links and that other pages link properly to them, and uploads them to the server all with little input from humans.
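The fill-in-the-blanks idea behind a CMS can be sketched in a few lines of code. Here’s a minimal illustration in Python; the page template and the product records are invented for the example, not taken from any real CMS:

```python
# A bare-bones sketch of what a CMS does: merge data into a page template.
# The template and the product records below are invented for illustration.

TEMPLATE = """<html>
  <head><title>{name}</title></head>
  <body>
    <h1>{name}</h1>
    <p>Price: ${price:.2f}</p>
    <p>{description}</p>
  </body>
</html>"""

def build_page(product):
    """Generate a finished HTML page from one product record."""
    return TEMPLATE.format(**product)

products = [
    {"name": "Ceramic Mug", "price": 12.5, "description": "Hand-thrown stoneware."},
    {"name": "Cat T-Shirt", "price": 18.0, "description": "100% cotton, 100% cat."},
]

# One template, many pages -- no human designer needed per product.
pages = {p["name"]: build_page(p) for p in products}
print(pages["Ceramic Mug"])
```

A real CMS pulls its records from a database and handles linking and uploading too, but the core move is the same: one template plus a pile of data yields as many pages as you need.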
Search engines and portals
There’s no way to tell exactly how much information is available on the web. We can’t even tell how many sites there are. We can make a reasonably good guess about how many domains are actively registered (nearly 134 million as of this writing), but many of them aren’t actually attached to active web sites.
Of the sites that actually exist, some of them have only a single page (catsinsinks.com) while others include thousands (8sails.com). The amount of information accessible from a single page also varies. Catsinsinks.com may only have one page, but you can keep clicking the “show me another cat in a sink” button and view a truly astonishing number of pictures of cats in sinks.
So in this vast sea of information, how do you find the one perfect page that tells you exactly what you need to know? If some kind soul provides you with an exact URL, then you’re in business. Otherwise you’ll probably need to use a search engine.
As of this writing, Google is by far the most popular search engine, processing more than 70% of all Internet searches. Yahoo! and Microsoft’s Bing also see significant usage.
Some sites seek to be more than simple search engines. In addition to helping users “surf the web,” they also supply at least some original content of their own. AOL offers subscribers discussion areas and other resources in addition to its search features. Search giant Google now owns a host of services, including Google Documents, Gmail and YouTube. Such multi-service sites are sometimes referred to as “portals,” as they’re sort of windows to the web world rather than just simple search processors.
HyperText Markup Language is the computer code that makes the web work. It defines how information is displayed on your screen in your browser, and it simplifies how you move from page to page and from site to site.
HyperText Transfer Protocol uses Uniform Resource Locator addresses to find pages on the web. To get directly to The Herd (the Media Survival Guide’s home page), you type “http://www.8sails.com/the-media-survival-guide.html.” That tells your browser to use HTTP to go to the Herd’s URL. Most browsers are smart enough to know that you’re using HTTP, that you want to go to an address on the World Wide Web and that you’re probably looking for a file full of HTML commands. So if you type nothing but “8sails.com/the-media-survival-guide” you’ll still probably end up where you need to go.
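You can watch a URL come apart into those pieces with a few lines of Python. This is just a sketch using the standard library’s urllib.parse module on the address from the text:

```python
from urllib.parse import urlsplit

# Split the Survival Guide's URL into the parts the browser uses.
url = "http://www.8sails.com/the-media-survival-guide.html"
parts = urlsplit(url)

print(parts.scheme)   # the protocol to use: "http"
print(parts.netloc)   # the host to contact: "www.8sails.com"
print(parts.path)     # the file to request: "/the-media-survival-guide.html"
```

When you type the shorthand version, the browser is essentially filling in the scheme (and guessing at the rest) before doing this same breakdown.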
Most of the rest of HTML tells your browser how to display the page on your screen. Needless to say, even simple pages can require complicated sets of commands to make them work.
The browser is the go-between that takes information on the web and displays it on your computer in a useful and hopefully visually appealing way. It takes files full of HTML commands and displays them according to the commands’ instructions.
As already noted, the first big browser was Netscape, and the code for it eventually became Firefox. Both Microsoft and Apple have their own browsers (Explorer and Safari, respectively). Internet giant Google has a browser called Chrome. You can also find other browsers out there, such as Lynx and Opera.
For the most part browsers all work the same. They have to. Otherwise web designers would have to create not one web site but several, one for each different browser. Fortunately for designers and users alike, web browsers all start with the same HTML instructions and obey them in pretty much the same way.
However, some discrepancies do occasionally crop up. Thus designers can generally adopt a good-for-one-good-for-all approach, but advanced features still have to be tested in more than one browser to avoid glitches that can affect a lot of users.
Nets and domains
The Internet is a vast network of networks. The technical way this works gets complicated in a hurry. So here’s a quick run-down of how I’m using it to work on the Media Survival Guide. As you’re reading this, think about how you use the Internet for your own work.
I did most of the writing for the core sections of the Survival Guide (such as what you’re reading now) as part of a sabbatical, so I was working mostly from home. The Mac on my desk is connected to a WiFi router (a basic wireless LAN in my house), which is in turn connected to telephone lines owned by the phone company. They connect me to my Internet Service Provider, which currently is Google Fiber. As the name implies, my ISP connects me to the Internet. By connecting to Google’s service, I can reach anywhere else on the net.
I accessed literally thousands of other computers in the course of doing research and setting up links for the Survival Guide. As I was working, I stored my notes and links on a wiki (a simple web page) on the Wikispaces server. I’ve since abandoned the wiki and now save links and the like using Apple’s Notes app (which mirrors all my notes between my computers, tablet and phone).
Back on my local computer, I use Muse (a web design program) to put pages together. Then I use my network connection to upload files to Dreamhost, the company whose servers host the 8sails web site.
“8sails.com” is the domain I own that reserves a location on the World Wide Web where my uploaded files can be found. It assures that when you go looking for the next section of the Survival Guide, it will be where your computer or smart device can find it. As you can imagine, that makes domains potentially highly valuable. Serious legal fights have erupted over who owns what on the web.
So it takes all the machines and networks listed just to get something as simple as the Survival Guide from me to you. Of course each step involves a combination of hardware and software too complex to list here. And if you’re reading this as an ebook, that’s even more tech to consider.
Without data, computers are useless. Everything you look at on the web – the text you’re reading, the pictures at the top of the page, the links and buttons that allow you to move from page to page, even the locations themselves – is made up of bits of information – data – stored on a computer somewhere. And if you want to keep data on a long-term basis, you need somewhere to store it.
Storage is divided into two categories: internal and external. In general, internal storage is for information that you’ll need to access regularly on your computer and external storage is for data you need to take with you to another location.
For some time now the standard device for internal storage has been the hard disk drive. Built into the chassis of the computer itself, this set of magnetic disks can store large amounts of data and access it rapidly. In addition to the files you’re actively using, the hard drive stores your computer’s operating system and other software you use to do what you need to do.
For external storage, on the other hand, technology changes at a faster pace. The earliest removable storage devices for home computers were cassette tapes, the same kind used for recorded music. These were ever so slow and unreliable, but they were quickly replaced by floppy disks (thinner, cheaper cousins of hard drives). Later innovations included CD-ROMs, Zip disks, and the currently popular USB (or thumb or jump) drives.
Now, thanks to the web, we have “the cloud,” a new term for the old practice of storing files on a server somewhere else on the Internet rather than on your personal computer. The cloud has advantages over local storage, such as accessibility (you can get to your files from anywhere on the Internet) and security (the server’s owner backs it up, so you don’t have to worry as much about a crash eating all your files). But it has drawbacks as well, such as accessibility (in some locations Internet access is painfully slow or even nonexistent) and security (files on servers are more vulnerable to hackers because they’re “out there” rather than on an individual machine that’s harder to find).
Many projects make use of multiple storage technologies. Work on this Survival Guide was completed using a combination of Word files stored on my local computer and outlines and notes stored in “the cloud.”
Networking computers together involves more than connecting them with some wire and hoping they can work it out between themselves. Naturally a full consideration of the inner life of computer networks is well beyond the scope of a Survival Guide entry (not to mention my own technical knowledge about the subject). However, a few net basics will help us get a handle on what we’re doing.
At their most basic level, networks aren’t generally thought of as networks at all. A computer with a scanner and a printer directly connected to it is technically a Personal Area Network.
The next step up is the Local Area Network. Some LANs aren’t too complicated; Appletalk used to be a simple way to connect a roomful of Macs together without too much fuss or expense. But for anything bigger, the business gets tricky. Even a simple wireless network (like the WiFi hub I use at work or even the less sophisticated one I have at home) requires sophisticated tech to make it run.
At the opposite end of the scale, Wide Area Networks span regions, countries, and in the case of the Internet the entire inhabited world. Often WANs are actually networks of networks. The Internet isn’t one single network. Instead, it’s an interconnected “web” of smaller LANs and WANs.
Cats that play the keyboard. Cats that want cheeseburgers. Cats that jump in and out of boxes. Cats that merge with Pop Tarts and fly around outer space shooting rainbows out of their butts.
Welcome to the wide, wonderful world of memes.
Once the web became popular enough to count as a mass medium, trends started to appear. Unlike the well-manicured messages from other media, these “memes” arose spontaneously with no particular logic. For whatever reason, some images, phrases and the like just seemed to catch on.
For example, at some point in the nebulous past one player in an online game asked a competitor where he was. “Im in ur base killin ur d00dz” came the reply. Somehow this managed to give rise to the “I’m in your X Ying your Z” meme, in which the original phrase was adapted in many strange ways, such as a photo of a cat eating a burger captioned “I’m in ur house eatin’ ur cheezburgers.”
Though memes are easy to criticize as frivolous at best and stupid at worst, they’re significant parts of our media culture for a couple of reasons. First, they arise more-or-less spontaneously, the work not of trained media pros but of average web users employing a little amateur ingenuity. And second, the inherent popularity of memes appeals to marketing folks, who seek ways to exploit them to sell products (I’m in ur awesome concert drinkin’ ur delicious, ice-cold Pepsi).
The Internet connects computers. The Web provides a framework for getting around. But human users need an interface to help them use the resources the net provides. The most popular software tool for using the Web is the browser.
Among early browsers Netscape was the most important. It was based on Mosaic, the pioneering browser developed by the National Center for Supercomputing Applications at the University of Illinois. After graduation, co-creator Marc Andreessen moved to California and helped establish Netscape Communications.
The company provided its browser, Netscape Navigator, free of charge to anyone who wanted to download it. As web usage spread, so did Netscape’s popularity. In 1996 it hit the high point of its market share, with nearly 80% of people surfing the web using Netscape as a surfboard.
But then the company ran up against a serious rival: Microsoft. The manufacturer of Windows started bundling its own browser, Internet Explorer, with every copy of the operating system. With Explorer pre-installed on their computers, many users saw no reason to take the extra steps required to get Netscape up and running.
Seeing the writing on the wall, Netscape released its code base (the software that made it work) under an open source license, meaning that anyone who wanted to adapt it into his own browser was free to do so. AOL bought Netscape Communications and eventually stopped supporting the browser.
However, the open source – which came to be known as Mozilla – led directly to the development of other browsers, including the currently-popular Firefox system.
By the early 1990s personal computers found their way into many homes. Though they were useful for typing letters, balancing checkbooks and playing games, they suffered from a serious limitation: isolation. Without a connection between home users and the world of information available on the Internet, computers couldn’t truly become a medium of mass communication.
At the time hooking a PC up to the outside world required a modem that could dial out on a phone line and connect to … well, to start there wasn’t a lot to connect to. For some time a system called CompuServe was a popular dial-up destination, but it catered mostly to “techies,” people who already knew their way around computer systems.
But then America Online – originally a downloadable game company – started an online service with a couple of interesting twists. First, it set up an easy-to-use GUI, making the service accessible even to people without tons of computer experience. And second, it promoted the heck out of itself. For some time it aggressively distributed free AOL software discs, mailing out so many that people began to devise unintended, creative uses for them.
At first AOL subscribers could access only services on the AOL system itself. Eventually, however, the company supplemented its own offerings with connections to Usenet and the Web.
For some time AOL dominated the home market, hitting a high water mark of 30 million subscribers. It even wrangled an expensive merger with media giant Time Warner. But as the dial-up market gave way to faster Internet connection technologies, AOL’s sun slowly began to set.
Apple already had a good thing going as the 1980s began. In the late 1970s the company controlled a large share of the limited market for personal computers with the Apple II. But the new decade brought bad fortune. The Apple III, the company’s latest and greatest, developed stability problems and had to be recalled and redesigned. Worse, competition from IBM and its clones shoved Apple to the sidelines.
But the company started bouncing back in 1984, launching its new Macintosh computer with an expensive, one-time-only ad that ran during the Super Bowl. Though the new system wasn’t as popular overall as PC-compatibles, it found acceptance in some markets such as graphic design.
Though Apple didn’t immediately seize control of the industry, it introduced the highly popular idea of making computing easier with a Graphical User Interface. Microsoft responded by creating Windows, though it didn’t find GUI success until Windows 3 in 1990.
The Macintosh line continued to gain in popularity throughout the 90s and 00s. Then Apple leapfrogged its way into the lead in the personal tech market with its line of iProducts: the iPod, iPhone and iPad.
As you learned in the Survival Guide’s introduction, new technology moves from the nerd stage to the mass medium stage only if a company invests some money in developing it. In the case of the personal computer, that company was International Business Machines. Several companies in the 1970s sold computers for home use, but IBM – with years of experience building big, expensive machines for the government and large corporations – was the first to put enough cash and marketing muscle into personal computers to make them sell.
In 1981 the company released the Personal Computer Model 5150, better known as the PC. MS-DOS, software designed by a small start-up company called Microsoft, controlled the machine.
Thanks to a loophole in the contract between IBM and Microsoft, the software company was free to sell DOS to anyone who could build a computer with hardware designed to run it. So several other companies jumped on the PC bandwagon, marketing “clones” or “PC compatibles” for lower prices than what IBM was charging.
Thus the personal computer found its way into businesses, schools and homes where no previous system could have gone.
The World Wide Web
A lot of people treat “Internet” and “Web” as synonyms. So let’s start with an understanding of the difference. The Internet is the network (or network of networks) that allows computers to communicate with one another. The Web, on the other hand, is one particular medium that runs on top of that network – a set of software standards that defines how linked pages are shared and displayed.
In the early 1990s Tim Berners-Lee and other programmers developed systems that used hyperlinks to connect information. These links could be used to “navigate” around the Internet, accessing information on whatever computer it happened to be stored. HyperText Markup Language (HTML) standardized the way that information was displayed on screens. Then in 1993 a team at the National Center for Supercomputing Applications designed Mosaic, the first browser to reach a wide audience; its point-and-click graphical interface made the web much easier to “surf.”
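To make the idea concrete, here is a minimal sketch of what a web page looks like under the hood (the title, text and link address are invented for illustration). The markup describes the page’s structure; any browser, from Mosaic’s era to today’s, decides how to draw it on screen:

```html
<!-- A bare-bones web page. Tags like <h1> and <p> describe structure;
     the browser decides exactly how to display them. -->
<!DOCTYPE html>
<html>
  <head>
    <title>My First Page</title>
  </head>
  <body>
    <h1>Hello, Web!</h1>
    <!-- The <a> ("anchor") tag is the hyperlink: click its text and the
         browser fetches whatever page the href address names. -->
    <p>Learn more at the <a href="https://example.com/">example site</a>.</p>
  </body>
</html>
```

Save text like this in a file ending in .html and any browser will render it the same basic way: heading large, paragraph below, link clickable. That shared standard is what made the web easy to surf.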
This new accessibility paved the way for people outside computer nerd circles to use the Web. By 1996 many businesses had begun setting up web presences, and the rest, as they say, is history.
On October 4, 1957, people in the United States got a bad scare. With almost no advance notice, the Soviet Union launched Sputnik 1, the first human-made object ever sent into space. In practical terms it didn’t do all that much, just orbited around the Earth broadcasting a beeping radio signal. But for Americans, who assumed they had a strong tech advantage over their Communist rivals, the news came as quite a shock.
The U.S. government responded to the crisis with a flurry of military and scientific projects designed to help us get caught up. One of these initiatives was the Advanced Research Projects Agency, a new section of the Department of Defense designed to coordinate the development of new military technology.
Several of the labs, military bases and universities working with ARPA had computers, but sharing data between them was a problem. Information had to be saved on bulky reels of magnetic tape and physically transported from one location to another, causing delays and posing security problems. If only the computers could talk to one another directly.
So engineers started work on a network, a series of connections that would allow computers to share data with one another. In 1969 the ARPAnet was born, linking four sites at universities and research centers in California and Utah. By 1970 it had expanded to the East Coast, and by 1975 it was declared fully operational.
Thus the Internet was born. Okay, it took another step or two to convert the original military-only network into the massive communication giant we all know and love today. But the image of the parent can still be found in the child.
In particular, the Internet then and now follows a distributed model. That means if one part of the network is disabled, all the other parts can still send information to one another. Back in the 1970s this feature was essential because early network hardware tended to fail more than it does now (yes, it actually used to be even worse), and on a distributed network a failure at one site wouldn’t shut down the whole show. It also made the network more resistant to enemy attack, an important feature for a military operation.
You’ll find that distributed structure mirrored in the observation that “nobody owns the Internet.” Individual parts belong to businesses, governments, colleges and individuals (people like you if you’re using your personal chunk of the Internet to read these words). But nobody owns or controls the whole thing.
In the beginning, computers were as big as houses. ENIAC, the world’s first general-purpose electronic computer, weighed 30 tons, took up 1800 square feet and sucked up enough electricity to power 150 homes. Not exactly a desktop model.
The reason early computers were so big and hungry (and of course expensive) was the vacuum tube. Or to be more precise, it was the thousands and thousands of vacuum tubes that made up the computer’s “brain.” These tubes looked a little like light bulbs. Like light bulbs, they tended to burn out and need to be replaced. Unlike a light bulb, though, a dead tube didn’t announce itself by going dark in any obvious way, so figuring out which tube had to be replaced was a tedious task.
The computing world took a big step forward in the late 1940s and early 1950s with the invention of transistors. These new devices did the same thing as vacuum tubes, but they were smaller, more energy efficient, cheaper to produce and didn’t burn out. Computers were now the size of cars, a great improvement over house-sized machines but still not exactly ready for the consumer market.
The microprocessor was the final piece in the puzzle. Starting in the 1960s, engineers figured out how to make integrated circuits – “chips” – that did the job of dozens, then hundreds, then thousands, then millions of transistors. Integrated circuits let NASA pack enough computing power into a space capsule to make a trip to the Moon, and by 1971 engineers had squeezed a computer’s entire processor onto a single chip – the microprocessor – allowing manufacturers back on Earth to market pocket calculators.
By the early 1980s computers fit into briefcase-sized boxes and no longer cost a small fortune. Businesses from large to small could afford to put them on employees’ desks, and people could buy them for home use as well.