11.5 Issues and Trends
Learning Objectives
- Define information superhighway as it relates to the Internet.
- Describe strategies for identifying credible sources online.
- Define net neutrality.
- Describe some of the effects of the Internet and social media on traditional media.
By 1994, the promise of the “information superhighway” had become so potent that the University of California, Los Angeles held a summit in its honor. Americans quickly realized that they could harness the spread of the web for educational purposes; more than just a diversion for computer hobbyists, this new vision of the web would act as a constant learning resource that anyone could use.
The American video artist pioneer Nam June Paik takes credit for the term information superhighway, which he used during a study for the Rockefeller Foundation in 1974, long before the existence of Usenet. In 2001, he said, “If you create a highway, then people are going to invent cars. That’s dialectics. If you create electronic highways, something has to happen (The Biz Media, 2010).” Paik’s prediction proved to be startlingly prescient.
Al Gore’s use of the term in the House of Representatives (and later as vice president) had a slightly different meaning and context. To Gore, the Interstate Highway System during the Eisenhower era demonstrated a promise that the government would work to allow communication across natural barriers and that citizens could then utilize these channels to conduct business and communicate with one another. Gore saw the government as playing an essential role in maintaining the pathways of electronic communication. Allowing business interests to get involved would compromise what he saw as a necessarily neutral purpose; a freeway doesn’t judge or demand tolls—it functions as a public service—and neither should the Internet. During his 2000 presidential campaign, critics wrongly ridiculed Gore for supposedly saying that he “invented the Internet,” but in reality, his work in the House of Representatives played a crucial part in developing the infrastructure required for Internet access.
Getting connected to the web, however, cost money. In that regard, AOL paralleled the Model T—its dial-up Internet service put access to the information superhighway within reach of the average person. But despite the affordability of AOL and the services that succeeded it, certain demographics continued to go without access to the Internet, a problem known as the “digital divide,” discussed later in this section.
From speed of transportation to credibility of information (don’t trust the stranger at the roadside diner), to security of information (keep the car doors locked), to net neutrality (toll-free roads), to the possibility of piracy, the metaphor of the information superhighway has proved remarkably apt. All of these issues have played out in different ways, both positive and negative, and they continue to develop to this day.
Information Access Like Never Before
In December 2002, a survey by the Pew Internet & American Life Project found that 84 percent of Americans believed that they could find information on health care, government, news, or shopping on the Internet (Jesdanun, 2002). People expect the Internet to provide virtually unlimited access to educational opportunities. To make this expectation a reality, the Bush Administration’s 2004 report “Toward a New Golden Age in Education: How the Internet, the Law, and Today’s Students are Revolutionizing Expectations” recommended increased broadband Internet access. As Nam June Paik predicted, stringing fiber optics around the world would allow for seamless video communication, a development that the Department of Education saw as integral to its vision of educating through technology. The report called for broadband access “24 hours a day, seven days a week, 365 days a year,” saying that it could “help teachers and students realize the full potential of this technology (U.S. Department of Education, 2004).”
The Cloud: Instant Updates, Instant Access
As technology has improved, it has become possible to provide software as a service that resides entirely online rather than on a user’s local machine. Because people can now stay connected to the Internet constantly, they can use online programs for all of their computing.
“Cloud computing” indicates the process of outsourcing common computing tasks to a remote server. The computer attached to the user’s monitor does not do or store the actual work, but rather it depends on the work of other (maybe many other) computers in the “cloud.” As a result, the user’s computer itself does not need that much processing power; instead of calculating “1 + 1 = 2,” the user’s computer asks the cloud, “What does 1 + 1 equal?” and receives the answer. Meanwhile, the system resources that a computer would normally devote to completing these tasks have become freed up to complete other tasks. In addition, users can store data in the cloud and retrieve it from any computer, making a user’s files more conveniently portable and less vulnerable to hardware failures like a hard drive crash. Of course, it can require quite a bit of bandwidth to send these messages back and forth to a remote server in the cloud, and the absence of a reliable, always-on Internet connection can somewhat limit the usefulness of these services.
The rise of web applications for mobile devices, such as the iPhone and devices running Google’s Android operating system, owes much to cloud computing. This shift began with 3G networks, which augmented the computing power of phones by giving them the ability to send data elsewhere for processing. For example, a Google Maps application does not itself calculate the shortest route between two places (taking into account how highways allow quicker trips than side roads, among numerous other computational difficulties); it simply asks Google to do the calculation and send back the result. 3G networks made this possible in large part because the speed of data transfer surpassed the speed of cell phones’ calculation abilities. As cellular transmission technology continues to improve with the rollout of next-generation 4G networks (the successors to 3G networks), connectivity speeds will further increase and allow for ever-more-comprehensive multimedia services.
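The division of labor described above—a “thin” client that forwards its questions to a remote service and simply uses the answers—can be sketched in a few lines of Python. This is a toy illustration of the idea only, not any real service’s API; the function and class names are hypothetical stand-ins, and the “cloud” here is just a local function playing the role of a remote server.

```python
def cloud_compute(expression: str) -> str:
    """Stand-in for a remote server: evaluate a simple arithmetic request.

    A real service would parse and validate requests arriving over the
    network; eval() is acceptable here only because the inputs are fixed
    strings in this sketch.
    """
    return str(eval(expression, {"__builtins__": {}}))


class ThinClient:
    """A client that forwards every task to the cloud instead of computing."""

    def ask(self, expression: str) -> str:
        # Instead of calculating "1 + 1" locally, send the question away
        # and receive the answer—the pattern described in the text.
        return cloud_compute(expression)


client = ThinClient()
print(client.ask("1 + 1"))  # the client never computes; the "cloud" answers
```

The point of the sketch is the shape of the exchange, not the arithmetic: the client’s only job is transmitting the question and receiving the result, which is why bandwidth and an always-on connection matter more to cloud computing than local processing power.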
Credibility Issues: (Dis)information Superhighway?
The Internet has undoubtedly proven a boon for researchers and writers everywhere. Online services range from up-to-date news and media platforms to vast archives of past writing and scholarship. However, the openness of the Internet means anyone with a few dollars can set up a credible-sounding website and begin to disseminate false information.
Taking the authorship of a site into account is a necessary step when judging information; more than just hunting down untrue statements, it can give insight into subtle biases that may arise and point to further research that needs to be done.
Just Trust Me: Bias on the Web
Legacy media journalistic sources usually attempt to achieve some sort of balance in their reporting; given reasonable access, they will interview opposing viewpoints and reserve judgment for the editorial page. Corporate sources will instead tilt the information toward their product.
Often, people consider the web as a source of entertainment, even in its informational capacity. Because of this, sites that rely on advertising may choose to publish something inflammatory that readers will likely link and forward more for its entertainment value than for its informational qualities.
On the other hand, a website might attempt to present itself as a credible source of information about a particular product or topic, with the end goal of selling something. A website that gives advice on how to protect against bedbugs that includes a direct link to its product may not offer the best information on the topic. While so much on the web remains free of cost, it is worthwhile looking into how websites maintain their services. If a website gives something away for free, the information might have a bias because the money needs to come from somewhere. The online archive of Consumer Reports requires a subscription to access it. Ostensibly, this subscription revenue allows the service to exist as an impartial judge, serving the users rather than the advertisers.
Occasionally, corporations may set up “credible” fronts to disseminate information. Climate change remains a contentious topic, and websites about the issue often reflect the biases of their owners. For example, the Cato Institute publishes columns disputing global-warming theory in many newspapers, including well-respected ones such as the Washington Times. Patrick Basham, an adjunct scholar at the Cato Institute, published the article “Live Earth’s Inconvenient Truths” in the Washington Times on July 11, 2007. Basham writes, “Using normal scientific standards, there is no proof we are causing the Earth to warm, let alone that such warming will cause an environmental catastrophe (Basham, 2007).”
However, the now-defunct website ExxposeExxon.com stated that the Cato Institute received $125,000 from the oil giant ExxonMobil, possibly tainting its data with bias (Exxon, 2006). In addition, Greenpeace ran ExxposeExxon.com as a side project, and the international environmental nonprofit may have its own reasons for producing this particular report. The document available on Greenpeace’s site (a scanned version of Exxon’s printout) states that in 2006, the corporation gave $20,000 to the Cato Institute (Greenpeace, 2007) (the other $105,000 was given over the previous decade).
This example highlights the difficulty of finding credible information online, especially when money gets involved. It also shows the lengths to which conflicting sources may go—sorting through a company’s corporate financial reports—to expose what they see as falsehoods. Is there an upside to all of this required fact-checking and cross-examination? Before the Internet, such verification would have required multiple telephone calls and plenty of time waiting on hold. While the Internet has made false information more widely available, it has also made checking that information remarkably easy.
Wikipedia: The Internet’s Precocious Problem Child
Could a platform provide free information available to all? That sounds like a dream come true—a dream that Wikipedia founder Jimmy Wales shared with the world. Since the site began in 2001, the Wikimedia Foundation (which hosts all of the Wikipedia pages) has become the sixth-most-visited site on the web, barely behind eBay in terms of its unique page views.
Organizations had long tried to develop factual content for the web, but Wikipedia aimed for something else: verifiability. The guidelines for editing Wikipedia state: “What counts is whether readers can verify that material added to Wikipedia has already been published by a reliable source, not whether editors think it is true (Wikipedia).” The benchmark for inclusion on Wikipedia includes outside citations for any content “likely to be challenged” and for “all quotations.”
While this may seem a step ahead of many other sources on the Internet, one hitch remains: Anyone can edit Wikipedia. This cuts both ways—though anyone can vandalize the site, anyone can also fix it. In addition, calling a particularly contentious page to attention can result in one of the site’s administrators placing a warning at the top of the page stating that the information presented has not necessarily been verified. Other warnings include notices on articles about living persons and articles that may violate Wikipedia’s neutrality policy. This neutrality policy attempts to mitigate the extreme views that may propagate on an open-access page, allowing the community to decide what constitutes a “significant” view that needs representation (Wikipedia).
As long as users do not take the facts on Wikipedia at face value and make sure to follow up on the relevant sources linked in the articles they read, the site serves as an extremely useful reference tool that gives users quick access to a wide range of subjects. However, the lack of authorial credit can lead to problems with judging bias and relevance of information, so users must take the same precautions with Wikipedia as with any other online source, primarily when checking references.
Security of Information on the Internet
As the Internet has grown in scope and the amount of personal information online has proliferated, securing this information has become a major issue. The Internet now houses everything from online banking systems to highly personal e-mail messages, and even though security constantly improves, this information has vulnerabilities.
A corporation or government can lose control of private information on the Internet. The same protocols that allow for open access and communication also allow for possible exploitation. Like the Interstate Highway System, the Internet offers impartial access to its users.
Can’t Wait: Denial of Service
Although many people increasingly rely on the Internet for communication and access to information, this reliance has come with a hefty price. Most critically, a simple exploit can cause massive roadblocks to Internet traffic, leading to disruptions in commerce, communication, and, as the military continues to rely on the Internet, national security.
Distributed denial-of-service (DDoS) attacks work like cloud computing, but in reverse. Instead of a single computer going out to retrieve data from many different sources, DDoS refers to a coordinated effort by many different computers to bring down (or overwhelm) a specific website. Essentially, any web server can only handle a certain amount of information at once. While the largest and most stable web servers can talk to a huge number of computers simultaneously, even these have limits.
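The mechanics of overwhelming a server’s capacity can be sketched with a toy simulation. The numbers and names here are hypothetical and purely illustrative—this models the arithmetic of the attack, nothing more: a server that can answer only a fixed number of requests at once must drop the rest, so coordinated junk traffic crowds out legitimate users.

```python
def serve(requests, capacity):
    """Answer up to `capacity` requests in arrival order; drop the rest.

    A crude model of a web server's limit on simultaneous connections.
    """
    answered = requests[:capacity]
    dropped = requests[capacity:]
    return answered, dropped


# One legitimate user trying to reach the site amid 999 attacking
# "zombie" machines sending junk requests (hypothetical numbers).
traffic = ["attacker"] * 999 + ["legitimate user"]
answered, dropped = serve(traffic, capacity=100)

print(f"answered: {len(answered)}, dropped: {len(dropped)}")
print("legitimate user served?", "legitimate user" in answered)
```

Because the flood arrives from many machines at once, no single source can be blocked to stop it—which is what makes the “distributed” part of a DDoS attack so difficult to defend against.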
A DDoS attack on government servers belonging to both the United States and South Korea in July 2009 rendered many U.S. government sites unavailable to users in Asia for a short time (Gorman & Ramstad, 2009). Although this did not have a major effect on U.S. cyber-security, the ease with which these servers could be exploited troubled experts. In this case, the DDoS attacks used an e-mail virus known as MyDoom that essentially turned users’ computers into server-attacking “zombies.”
Net Neutrality
Most Internet users in the United States connect through a commercial Internet service provider (ISP). The major players—Comcast, Verizon, Time Warner Cable, AT&T, and others—provide portals to the larger Internet, serving as a way for anyone with a cable line or phone line to receive broadband Internet access through a dedicated data line.
Ideally, ISPs treat all content impartially; any two websites will load at the same speed if they have adequate server capabilities. Service providers wish to modify this arrangement: ISPs have proposed a service model that would allow corporations to pay for “higher tier” service. For example, this would let AOL Time Warner deliver its Hulu service (which Time Warner co-owns with NBC) faster than all other video services, leading to partnerships between Internet content providers and Internet service providers. The service providers also often foot the bill for expanding high-speed Internet access, and they see this two-tiered service as a way to recoup some of that investment (and, presumably, to reinvest the funds received).
Critics fear—and the reason the FCC introduced net neutrality rules—that such a service would hamper the ability of an Internet startup to grow its business. Defenders of net neutrality contend that small businesses (those without the ability to forge partnerships with the service providers) would be forced onto a “second-tier” Internet service, and their content would naturally suffer, decreasing inventiveness and competition among Internet content providers.
Before the 1960s, AT&T had permission to restrict its customers to using only its telephones on its networks. In the 1960s, the FCC launched a series of “Computer Inquiries,” stating, in effect, that any customer could use any device on the network, as long as it did not harm the network. This led to inventions such as the fax machine, which could not have occurred under AT&T’s previous agreement.
This proto–net neutrality rule protected innovation even when it “threatened to be a substitute for regulated services (Cannon, 2003).”
Digital Technology and Electronic Media
Content on the Internet competes with content from other media outlets. Unlimited and cheap digital duplication of content removes the concept of scarcity from the economic model of media; consumers no longer need to buy a physical CD fabricated by a company to play music, and digital words on a screen convey the news just as well as words printed on physical newspapers. Media must reinvent themselves as listeners, readers, and watchers have divided into smaller and smaller subcategories.
Traditional media companies have had to evolve to adapt to the changes wrought by the Internet revolution, but these media still have relevance in an online world. For example, social media can provide a very inexpensive and reliable model for maintaining a band’s following. A record company (or the band itself) can start a Facebook page, through which it can notify all its fans about new albums and tour dates—or even just remind fans that it still exists. Social networking allows anyone to discover a band from anywhere in the world, leading to the possibility of varying and eclectic tastes not bound by geography.