"

11.5 Issues and Trends

By 1994, the promise of the “information superhighway” had become so potent that the University of California, Los Angeles, held a summit in its honor. The country quickly realized that it could harness the spread of the web for educational purposes; more than just a diversion for computer hobbyists, the web could act as a constant learning resource, giving anyone unprecedented access to knowledge and information.

The pioneering American video artist Nam June Paik is credited with coining the term “information superhighway,” which he used in a study for the Rockefeller Foundation in 1974, long before the existence of Usenet. In 2001, he said, “If you create a highway, then people are going to invent cars. That’s dialectics. If you create electronic highways, something has to happen (The Biz Media, 2010).” Made decades before the Internet as we know it existed, Paik’s prediction proved startlingly prescient.

Al Gore’s use of the term in the House of Representatives (and later as vice president) had a slightly different meaning and context. To Gore, the Interstate Highway System of the Eisenhower era represented a promise that the government would work to facilitate communication across natural barriers, allowing citizens to use these channels to conduct business and communicate with one another. Gore saw the government as playing an essential role in maintaining the pathways of electronic communication, and he believed that allowing business interests to get involved would compromise what he saw as a necessarily neutral purpose: a freeway functions as a public service, neither judging its users nor demanding tolls, and the Internet should work the same way. During his 2000 presidential campaign, critics wrongly ridiculed Gore for supposedly claiming that he “invented the Internet”; in reality, his work in Congress played a crucial part in developing the infrastructure required for Internet access.

Al Gore photo
While Al Gore did not invent the Internet, he played a significant role in popularizing the term “information superhighway” and advocating for Internet infrastructure and neutrality. His efforts were instrumental in building support for the Internet as a public utility. Source: Dan Farber, Al Gore at Web 2.0 Summit, CC BY-NC 2.0.

However, users needed to spend a certain amount of money to get connected to the web. In that regard, AOL paralleled the Model T: its dial-up service put access to the information superhighway within reach of the average person. Yet despite the affordability of AOL and the services that succeeded it, certain demographics continued to lack access to the Internet, a problem known as the “digital divide,” which is discussed later in this section.

From the speed of transportation to the credibility of information (don’t trust the stranger at the roadside diner), to the security of information (keep the car doors locked), to net neutrality (toll-free roads), to the possibility of piracy, the metaphor of the information superhighway has proven remarkably apt. All of these issues have played out in various ways, both positively and negatively, and they continue to evolve to this day.

Information Access Like Never Before

In December 2002, a survey by the Pew Internet & American Life Project found that 84 percent of Americans believed they could find information on healthcare, government, news, or shopping on the Internet (Jesdanun, 2002). People expect the Internet to provide virtually unlimited access to educational opportunities. To make this expectation a reality, the Bush Administration’s 2004 report “Toward a New Golden Age in American Education: How the Internet, the Law, and Today’s Students Are Revolutionizing Expectations” recommended increased broadband Internet access. As Nam June Paik predicted, stringing fiber optics around the world would enable seamless video communication, a development that the Department of Education viewed as integral to its vision of education through technology. The report called for broadband access “24 hours a day, seven days a week, 365 days a year,” saying that it could “help teachers and students realize the full potential of this technology (U.S. Department of Education, 2004).”

The Cloud: Instant Updates, Instant Access

As technology has improved, it has become possible to provide software to users as a service that resides entirely online rather than on a person’s computer. Because people can now stay connected to the Internet constantly, they can use online programs to handle many of their computing tasks.

“Cloud computing” refers to the practice of outsourcing everyday computing tasks to a remote server. The computer attached to the user’s monitor does not perform or store the actual work; rather, it relies on the work of other (possibly many) computers in the “cloud.” As a result, the user’s computer itself does not require much processing power: instead of calculating “1 + 1 = 2,” it asks the cloud, “What does 1 + 1 equal?” and receives the answer. Meanwhile, the system resources that the computer would usually devote to these tasks are freed up for other work. In addition, users can store data in the cloud and retrieve it from any computer, making their files more conveniently portable and less vulnerable to hardware failures, such as a hard drive crash. Of course, sending these messages back and forth to a remote server can require a significant amount of bandwidth, and the absence of a reliable, always-on Internet connection can limit the usefulness of these services.
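
To make this concrete, here is a minimal, runnable sketch of that division of labor using only Python’s standard-library XML-RPC modules. The host, port, and add function are illustrative stand-ins for a real cloud service, not any particular provider’s API.

```python
# Minimal sketch: a tiny XML-RPC "cloud" performs the arithmetic while
# the client merely asks for the answer. Host and port are placeholders.
import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

def add(a, b):
    return a + b  # the computation happens on the "cloud" side

# Stand-in for the remote server; in practice this would run elsewhere.
server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
server.register_function(add, "add")
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client does not compute 1 + 1 itself; it asks the cloud.
cloud = xmlrpc.client.ServerProxy("http://localhost:8000")
print(cloud.add(1, 1))  # -> 2, calculated remotely and sent back
```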

Cloud computing diagram
Cloud computing allows a computer to contain very little actual information. Many of the programs used by the now-popular “netbooks” store information online. Source: Sam Johnston, Cloud computing, CC BY-SA 3.0.

The rise of web applications for mobile devices, such as the iPhone and devices running Google’s Android operating system, owes much of its success to cloud computing made practical by 3G networks. These networks augment the computing power of phones by allowing them to send data elsewhere for processing. For example, a Google Maps application does not itself calculate the shortest route between two places (accounting for how highways allow quicker trips than side roads, among numerous other computational difficulties); instead, it asks Google to do the calculation and send over the result. 3G networks made this possible largely because the speed of data transfer surpassed the speed of cell phones’ calculation abilities. As cellular transmission technology improves with the rollout of next-generation 4G networks, connectivity speeds will increase further, allowing for richer multimedia services.
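
A hedged sketch of this offloading pattern appears below: the phone sends an origin and destination to a remote directions service and simply parses the computed result. The URL and response fields are hypothetical placeholders, not Google’s actual Maps API.

```python
# Hypothetical sketch of offloading route calculation to a remote
# service; the endpoint and JSON fields below are invented placeholders.
import json
import urllib.parse
import urllib.request

def fetch_route(origin, destination):
    params = urllib.parse.urlencode({"from": origin, "to": destination})
    url = "https://maps.example.com/route?" + params  # placeholder host
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    # The heavy shortest-path computation happened on the server; the
    # phone only reads the (hypothetical) result fields.
    return data["distance_km"], data["steps"]

# Example call (would work only against a real service):
# distance, steps = fetch_route("Boise, ID", "Coeur d'Alene, ID")
```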

Credibility Issues: (Dis)information Superhighway?

The Internet has undoubtedly proven a boon for researchers and writers everywhere. Online services range from up-to-date news and media platforms to vast archives of past writing and scholarship. However, the openness of the Internet means that anyone with a few dollars can set up a credible-looking website and begin disseminating false information.

Considering a site’s authorship is a necessary step when evaluating information; beyond identifying outright false statements, it can reveal subtle biases and suggest where further research is needed.

Just Trust Me: Bias on the Web

Journalistic sources in the legacy media typically strive for some balance in their reporting; given reasonable access, they will interview opposing viewpoints and reserve judgment for their editorial pages. Corporate sources, by contrast, will present information in a way that favors their products.

Often, people consider the web a source of entertainment, as well as a means of gathering information. Because of this, sites that rely on advertising may choose to publish something inflammatory that readers will likely link to and forward more for its entertainment value than for its informational qualities.

On the other hand, a website may present itself as a credible source of information about a particular product or topic, with the end goal of selling something. A website that provides advice on how to protect against bedbugs, including a direct link to its product, may not offer the most accurate information on the topic. While much of the web remains free of cost, it is worthwhile to explore how websites sustain their services. If a website offers something for free, the information may have a bias because the money must come from somewhere. The online archive of Consumer Reports requires a subscription to access it. Ostensibly, this subscription revenue enables the service to operate as an impartial judge, serving users rather than advertisers.

Occasionally, corporations may set up “credible” fronts to disseminate information. Climate change remains a contentious topic, and websites about the issue often reflect the biases of their owners. For example, the Cato Institute publishes columns disputing global warming in many newspapers, including well-respected ones such as The Washington Times. Patrick Basham, an adjunct scholar at the Cato Institute, published the article “Live Earth’s Inconvenient Truths” in The Washington Times on July 11, 2007. Basham writes, “Using normal scientific standards, there is no proof we are causing the Earth to warm, let alone that such warming will cause an environmental catastrophe (Basham, 2007).”

However, the now-defunct website ExxposeExxon.com stated that the Cato Institute received $125,000 from the oil giant ExxonMobil, possibly tainting its data with bias (Exxon, 2006). In addition, Greenpeace ran ExxposeExxon.com as a side project, and the international environmental nonprofit may have its reasons for producing this particular report. The document available on Greenpeace’s site (a scanned version of Exxon’s printout) states that in 2006, the corporation gave $20,000 to the Cato Institute (Greenpeace, 2007) (the other $105,000 was given over the previous decade).

This example highlights the difficulty of finding credible information online, particularly when financial interests are involved. It also shows the lengths to which conflicting sources may go, such as sorting through a company’s corporate financial reports, to expose what they see as falsehoods. What is the upside to all of this required fact-checking and cross-examination? Before the Internet, such research would likely have required multiple telephone calls and a significant amount of time spent waiting on hold. While the Internet has made false information more widely available, it has also made checking that information far easier.

Wikipedia: The Internet’s Precocious Problem Child

Could a platform provide free information available to all? That sounds like a dream come true, and it is the dream that Wikipedia founder Jimmy Wales shared with the world. Since the site began in 2001, the Wikimedia Foundation (which hosts all of the Wikipedia pages) has become the sixth-most-visited web parent company, barely behind eBay in unique audience.

Top 10 Global Web Parent Companies, Home and Work. Source: The Nielsen Company

| Rank | Parent | Unique Audience (000) | Active Reach (%) | Time (hh:mm:ss) |
| --- | --- | --- | --- | --- |
| 1 | Google | 362,006 | 84.29 | 2:27:15 |
| 2 | Microsoft | 322,352 | 75.06 | 2:53:48 |
| 3 | Yahoo! | 238,035 | 55.43 | 1:57:26 |
| 4 | Facebook | 218,861 | 50.96 | 6:22:24 |
| 5 | eBay | 163,325 | 38.03 | 1:42:46 |
| 6 | Wikimedia | 154,905 | 36.07 | 0:15:14 |
| 7 | AOL LLC | 128,147 | 29.84 | 2:08:32 |
| 8 | Amazon | 128,071 | 29.82 | 0:23:24 |
| 9 | News Corp. | 125,898 | 29.31 | 0:53:53 |
| 10 | InterActiveCorp | 122,029 | 28.41 | 0:10:52 |

Organizations had long tried to develop factual content for the web, but Wikipedia went for something else: verifiability. The guidelines for editing Wikipedia state: “What counts is whether readers can verify that material added to Wikipedia has already been published by a reliable source, not whether editors think it is true (Wikipedia).”  The benchmark for inclusion on Wikipedia includes outside citations for any content “likely to be challenged” and for “all quotations.”

While this may seem a step ahead of many other sources on the Internet, a slight hitch remains: anyone can edit Wikipedia. This has both positive and negative aspects; though anyone can vandalize the site, anyone can also repair it. In addition, drawing attention to a particularly contentious page can lead one of the site’s administrators to place a warning at the top, stating that the information presented has not necessarily been verified. Other warnings include notices on articles about living persons and on articles that may violate Wikipedia’s neutrality policy. This neutrality policy aims to temper the extreme views that can be propagated on an open-access page by allowing the community to determine what constitutes a “significant” view that warrants representation (Wikipedia).

As long as users do not take the facts on Wikipedia at face value and verify the information by following the sources linked in the articles they read, the site serves as a handy reference tool, providing quick access to a wide range of subjects. However, the lack of authorial credit can make it difficult to judge the bias and relevance of information, so users must take the same precautions with Wikipedia as with any other online source, particularly by checking references.

Security of Information on the Internet

As the Internet has expanded in scope and the amount of personal information online has increased, securing this information has become a significant concern. The Internet now houses everything from online banking systems to highly personal email messages, and even though security constantly improves, this information remains vulnerable.

A corporation or government can lose control of private information on the Internet. The same protocols that enable open access and communication also facilitate potential exploitation. Like the Interstate Highway System, the Internet offers impartial access to its users.

Can’t Wait: Denial of Service

Although many people increasingly rely on the Internet for communication and access to information, this reliance has come with a hefty price. Most critically, a simple exploit can cause massive roadblocks to Internet traffic, leading to disruptions in commerce, communication, and, as the military continues to rely on the Internet, national security.

Distributed denial-of-service (DDoS) attacks work like cloud computing, but in reverse. Instead of a single computer retrieving data from multiple sources, a DDoS attack is a coordinated effort by numerous computers to bring down (or overwhelm) a specific website. Any web server can only handle a certain amount of information at once. While the largest and most stable web servers can communicate with a vast number of computers simultaneously, even they have their limits.
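
The capacity problem can be illustrated with a toy simulation, shown below. The numbers are invented, and the sketch models only the crowding-out effect on a finite-capacity server, not any actual attack technique.

```python
# Toy model of a server with finite capacity: once "zombie" traffic
# arrives, legitimate users are crowded out in proportion to the flood.
CAPACITY = 100        # requests the server can answer per time step
NORMAL_TRAFFIC = 80   # typical legitimate load per step
ATTACK_BOTS = 500     # invented number of attacking machines

for tick in range(5):
    attack = ATTACK_BOTS if tick >= 2 else 0  # attack begins at tick 2
    total = NORMAL_TRAFFIC + attack
    served = min(total, CAPACITY)
    # Assume served requests are drawn evenly from the incoming mix.
    served_legit = round(served * NORMAL_TRAFFIC / total)
    print(f"tick {tick}: {total:>3} requests in, {served} served, "
          f"{served_legit} of {NORMAL_TRAFFIC} legitimate users got through")
```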

A DDoS attack on government servers belonging to both the United States and South Korea in July 2009 rendered many U.S. government sites unavailable to users in Asia for a short time (Gorman & Ramstad, 2009). Although this did not have a significant effect on U.S. cybersecurity, the ease with which attackers could overwhelm these servers troubled experts. In this case, the DDoS attacks utilized an email virus known as MyDoom, which essentially turned users’ computers into server-attacking “zombies.”

Net Neutrality

Most Internet users in the United States connect through a commercial Internet service provider (ISP). The major players—Comcast, Verizon, Time Warner Cable, AT&T, and others—provide portals to the larger Internet, serving as a way for anyone with a cable line or phone line to receive broadband Internet access through a dedicated data line.

Ideally, ISPs treat all content impartially; any two websites will load at the same speed if they have adequate server capabilities. Some service providers wish to modify this arrangement. ISPs have proposed a service model that would allow corporations to pay for a “higher tier” of service. For example, this would enable Time Warner to deliver its Hulu service (which it co-owns with NBC) faster than all other video services, leading to partnerships between Internet content providers and Internet service providers. Service providers also often foot the bill for expanding high-speed Internet access, and they view this two-tiered service as a way to recoup some of that investment (and, presumably, reinvest the funds received).
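
As a rough illustration of how such a two-tiered arrangement might work inside a congested router, the sketch below always forwards “premium” packets before “standard” ones. The tier names and packets are invented for illustration, not a description of any real ISP’s system.

```python
# Toy scheduler for a two-tiered ("paid prioritization") network:
# premium packets always leave the queue before standard ones.
import heapq

PRIORITY = {"premium": 0, "standard": 1}  # lower value = sent first

packets = [
    ("standard", "startup-video-chunk-1"),
    ("premium", "partner-video-chunk-1"),
    ("standard", "startup-video-chunk-2"),
    ("premium", "partner-video-chunk-2"),
]

queue = []
for arrival, (tier, payload) in enumerate(packets):
    # arrival order breaks ties within a tier
    heapq.heappush(queue, (PRIORITY[tier], arrival, payload))

while queue:
    _, _, payload = heapq.heappop(queue)
    print("forwarding", payload)
# Under congestion, the standard tier's content always waits behind the
# premium tier's, so the smaller company's site effectively loads slower.
```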

Critics fear that such a service would hamper the ability of an Internet startup to grow its business; this concern is why the FCC introduced net neutrality rules. Defenders of net neutrality contend that small companies (those unable to forge partnerships with the service providers) would be forced onto a “second-tier” Internet service, where their content would naturally suffer, decreasing inventiveness and competition among Internet content providers.

Before the 1960s, AT&T had permission to restrict its customers to using only its telephones on its networks. In the 1960s, the FCC launched a series of “Computer Inquiries,” stating, in effect, that any customer could use any device on the network, as long as it did not harm the network. This led to inventions such as the fax machine, which would not have been possible under AT&T’s previous agreement.

This proto–net neutrality rule protected innovation even when it “threatened to be a substitute for regulated services (Cannon, 2003).”

Digital Technology and Electronic Media

Content on the Internet competes with content from other media outlets. Unlimited and inexpensive digital duplication of content removes the concept of scarcity from the economic model of media; consumers no longer need to buy a physical CD manufactured by a company to play music, and digital words on a screen convey the news just as effectively as words printed on physical newspapers. The media must reinvent themselves as listeners, readers, and viewers have become increasingly divided into smaller and smaller subcategories.

Traditional media companies have had to evolve to adapt to the changes brought about by the Internet revolution, but these media still retain relevance in an online world. For example, social media can provide a very inexpensive and reliable model for maintaining a band’s following. A record company (or the band itself) can start a Facebook page, through which it can notify all its fans about new albums and tour dates—or even remind fans that it still exists. Social networking enables anyone to discover a band from anywhere in the world, leading to the possibility of diverse and eclectic tastes that are not bound by geography.

License


Mass Media in a Free Society Copyright © 2024 by North Idaho College is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.