Source Edition – Chapters 10-12
Chapter 10: Electronic Games and Entertainment
10.1 Electronic Games and Entertainment
10.2 The Evolution of Electronic Games
10.3 Influential Contemporary Games
10.4 The Impact of Video Games on Culture
10.5 Controversial Issues
10.6 Blurring the Boundaries Between Video Games, Information, Entertainment, and Communication
10.1 Electronic Games and Entertainment
Want to Get Away?
Anders Adermark – Playing – CC BY-NC-ND 2.0.
Video games have come a long way from using a simple joystick to guide Pac-Man on his mission to find food and avoid ghosts. This is illustrated by a 2007 Southwest Airlines commercial in which two friends are playing a baseball video game on a Nintendo Wii–like console. The batting friend tells the other to throw his pitch—and he does, excitedly firing his controller into the middle of the plasma TV, which then falls off the wall. Both friends stare in shock at the shattered flat screen as the narrator asks, “Want to get away?”
Such a scene is unlikely to have taken place in the early days of video games, when Atari reigned supreme and play centered solely on hand–eye coordination. The learning curve was practically nonexistent; players maneuvered a joystick to shoot lines of aliens in the sky, or they turned the wheel on a paddle to play a virtual game of table tennis.
But as video games became increasingly popular, they also became increasingly complex. Consoles upgraded and evolved on a regular basis, and the games kept up. Players once called each other with tips and loopholes for getting Mario and Luigi to the next level; now they exchange their tricks on gaming blogs. Games like The Legend of Zelda and Final Fantasy created alternate worlds and intricate story lines, providing multiple-hour epic adventures.
Long criticized for taking kids out of the backyard and into a sedentary spot in front of the TV, many video games have circled back to their simpler origins and, in doing so, have made players more active. Casual gamers who could quickly figure out how to put together puzzle pieces in Tetris can now just as easily figure out how to “swing” a tennis racket with a Wiimote. Video games are no longer a convenient scapegoat for America’s obesity problems; Wii Fit offers everything from yoga to boxing, and Dance Dance Revolution estimates calories burned while players dance.
The technology and design of video games continue to change, and as they do, gaming has begun to intersect with every other part of culture. Players can learn how to “perform” their favorite songs with Guitar Hero and Rock Band. Product placement akin to what is seen in movies and on TV is equally prevalent in video games such as the popular Forza Motorsport or FIFA series. As the Internet allows for players across the world to participate simultaneously, video games have the potential to one day look like competitive reality shows (Dolan). Arguably, video games even hold a place in the art world, with the increasing complexity of animation and story lines (Tres Kap).
And now, with endless possibilities for the future, video games are attracting new and different demographics. Avid players who grew up with video games may be the first ones to purchase 3-D televisions for the 3-D games of the future (Williams, 2010). But casual players, perhaps of an older demographic, will be drawn to the simplicity of a game like Wii Bowling. Video games have become more accessible than ever. Social media websites like Facebook offer free video game applications, and smartphone users can download apps for as little as a dollar, literally putting video games in one’s back pocket. Who needs a cumbersome Scrabble board when it’s available on a touch screen anytime, anywhere?
Video games have become ubiquitous in modern culture. Understanding them as a medium allows a fuller understanding of their implications in the realms of entertainment, information, and communication. Studying their history reveals new perspectives on the ways video games have affected mainstream culture.
References
Dolan, Michael. “The Video Game Revolution: The Future of Video Gaming,” PBS, http://www.pbs.org/kcts/videogamerevolution/impact/future.html.
Tres Kap, Jona. “The Video Game Revolution: But is it Art?” PBS, http://www.pbs.org/kcts/videogamerevolution/impact/art.html.
Williams, M. H. “Study Shows Casual and Core Gamers Are Ready for 3-D Gaming,” Industry Gamers, June 15, 2010, http://www.industrygamers.com/news/study-shows-casual-and-core-gamers-are-ready-for-3d-gaming/.
10.2 The Evolution of Electronic Games
Learning Objectives
- Identify the major companies involved in video game production.
- Explain the important innovations that drove the acceptance of video games by mainstream culture.
- Determine major technological developments that influenced the evolution of video games.
Pong, the electronic table-tennis simulation game, was the first video game for many people who grew up in the 1970s and is now a famous symbol of early video games. However, the precursors to modern video games were created as early as the 1950s. In 1952 a computer simulation of tic-tac-toe was developed for the Electronic Delay Storage Automatic Calculator (EDSAC), one of the first stored-program computers, and in 1958 a game called Tennis for Two was developed at Brookhaven National Laboratory as a way to entertain people coming through the laboratory on tours (Egenfeldt-Nielsen, 2008).
These games would generate little interest among the modern game-playing public, but at the time they enthralled their users and introduced the basic elements of the cultural video game experience. In a time before personal computers, these games allowed the general public to access technology that had been restricted to the realm of abstract science. Tennis for Two created an interface where anyone with basic motor skills could use a complex machine. The first video games functioned early on as a form of media by essentially disseminating the experience of computer technology to those who did not have access to it.
As video games evolved, their role as a form of media grew as well. Video games have grown from simple tools that made computing technology understandable to forms of media that can communicate cultural values and human relationships.
The 1970s: The Rise of the Video Game
The 1970s saw the rise of video games as a cultural phenomenon. A 1972 article in Rolling Stone describes the early days of computer gaming:
Reliably, at any nighttime moment (i.e. non-business hours) in North America hundreds of computer technicians are effectively out of their bodies, locked in life-or-death space combat computer-projected onto cathode ray tube display screens, for hours at a time, ruining their eyes, numbing their fingers in frenzied mashing of control buttons, joyously slaying their friend and wasting their employers’ valuable computer time. Something basic is going on (Brand, 1972).
This scene was describing Spacewar!, a game developed in the 1960s at the Massachusetts Institute of Technology (MIT) that spread to other college campuses and computing centers. In the early ’70s, very few people owned computers. Most computer users worked or studied at university, business, or government facilities. Those with access to computers were quick to utilize them for gaming purposes.
Arcade Games
The first coin-operated arcade game was modeled on Spacewar! It was called Computer Space, and it fared poorly among the general public because of its difficult controls. In 1972, Pong, the table-tennis simulator that has come to symbolize early computer games, was created by the fledgling company Atari, and it was immediately successful. Pong was initially placed in bars with pinball machines and other games of chance, but as video games grew in popularity, they were placed in any establishment that would take them. By the end of the 1970s, so many video arcades were being built that some towns passed zoning laws limiting them (Kent, 1997).
The end of the 1970s ushered in a new era—what some call the golden age of video games—with the game Space Invaders, an international phenomenon that exceeded all expectations. In Japan, the game was so popular that it caused a national coin shortage. Games like Space Invaders illustrate both the effect of arcade games and their influence on international culture. In two different countries on opposite sides of the globe, Japanese and American teenagers, although they could not speak to one another, were having the same experiences thanks to a video game.
Video Game Consoles
The first video game console for the home began selling in 1972. It was the Magnavox Odyssey, and it was based on prototypes built by Ralph Baer in the late 1960s. This system included a Pong-type game, and when the arcade version of Pong became popular, the Odyssey began to sell well. Atari, which was making arcade games at the time, decided to produce a home version of Pong and released it in 1974. Although this system could only play one game, its graphics and controls were superior to the Odyssey’s, and it was sold through a major department store, Sears. Because of these advantages, the Atari home version of Pong sold well, and a host of other companies began producing and selling their own versions of Pong (Herman, 2008).
A major step forward in the evolution of video games was the development of game cartridges that stored the games and could be interchanged in the console. With this technology, users were no longer limited to a set number of games, leading many video game console makers to switch their emphasis to producing games. Several companies, such as Magnavox, Coleco, and Fairchild, released versions of cartridge-type consoles, but Atari’s 2600 console had the upper hand because of the company’s work on arcade games. Atari capitalized on its arcade successes by releasing games that were well known to a public that was frequenting arcades. The popularity of games such as Space Invaders and Pac-Man made the Atari 2600 a successful system. The late 1970s also saw the birth of companies such as Activision, which developed third-party games for the Atari 2600 (Wolf).
Home Computers
The birth of the home computer market in the 1970s paralleled the emergence of video game consoles. The first computer designed and sold for the home consumer was the Altair. First sold in 1975, several years after the earliest video game consoles, it appealed mainly to a hobbyist market. During this period, people such as Steve Jobs, a cofounder of Apple, were building computers by hand and selling them to get their start-up businesses going. In 1977, three important computers—Radio Shack’s TRS-80, the Commodore PET, and the Apple II—were produced and began selling to the home market (Reimer, 2005).
The rise of personal computers allowed for the development of more complex games. Designers of games such as Mystery House, developed in 1979 for the Apple II, and Rogue, developed in 1980 and played on IBM PCs, used the processing power of early home computers to develop video games that had extended plots and story lines. In these games, players moved through landscapes composed of basic graphics, solving problems and working through an involved narrative. The development of video games for the personal computer platform expanded the ability of video games to act as media by allowing complex stories to be told and new forms of interaction to take place between players.
The 1980s: The Crash
Atari’s success in the home console market was due in large part to its ownership of already-popular arcade games and the large number of game cartridges available for the system. These strengths, however, eventually proved detrimental to the company and led to what is now known as the video game crash of 1983. Atari bet heavily on its past successes with popular arcade games by releasing Pac-Man for the Atari 2600. Pac-Man was a successful arcade game that did not translate well to the home console, leading to disappointed consumers and lower-than-expected sales. Additionally, Atari produced 10 million copies of the lackluster Pac-Man port on its first run, even though the number of active consoles was estimated at only 10 million—nearly one cartridge for every console owner. Similar mistakes were made with a game based on the movie E.T.: The Extra-Terrestrial, which has gained notoriety as one of the worst games in Atari’s history. It was not received well by consumers despite the success of the movie, and Atari had again bet heavily on its success. Piles of unsold E.T. game cartridges were reportedly buried in the New Mexico desert under a veil of secrecy (Montfort & Bogost, 2009).
As retail outlets became increasingly wary of home console failures, they began stocking fewer games on shelves. This action, combined with an increasing number of companies producing games, led to overproduction and a resulting fallout in the video game market in 1983. Many smaller game developers did not have the capacity to withstand this downturn and went out of business. Although Coleco and Atari were able to make it through the crash, neither company regained its former share of the video game market. The market did not recover until 1985.
The Rise of Nintendo
Nintendo, a Japanese card and novelty producer that had begun to produce electronic games in the 1970s, was responsible for arcade games such as Donkey Kong in the early 1980s. Its first home console, released in Japan in 1983, tried to succeed where Atari had failed. The Nintendo system used newer, better microchips, bought in large quantities, to ensure high-quality graphics at a price consumers could afford. Keeping console prices low meant Nintendo had to rely on games for most of its profits and maintain control of game production. Atari’s failure to do this had produced a glut of low-priced games that helped cause the crash of 1983. Nintendo got around the problem with proprietary circuits that would not allow unlicensed games to be played on the console. This allowed Nintendo to dominate the home video game market through the end of the decade, when one-third of homes in the United States had a Nintendo system (Cross & Smits, 2005).
Nintendo introduced its Nintendo Entertainment System (NES) in the United States in 1985. The game Super Mario Brothers, released with the system, was also a landmark in video game development. The game employed a narrative in the same manner as more complicated computer games, but its controls were accessible and its objectives simple. The game appealed to a younger demographic than the one Atari had targeted, generally boys ages 8 to 14 (Kline et al., 2003). Its designer, Shigeru Miyamoto, tried to mimic the experiences of childhood adventures, creating a fantasy world not based on previous models of science fiction or other literary genres (McLaughlin, 2007). Super Mario Brothers also gave Nintendo an iconic character who has been used in numerous other games, television shows, and even a movie. The development of this type of character and fantasy world became the norm for video game makers. Games such as The Legend of Zelda became franchises with film and television possibilities rather than simply one-off games.
As video games developed as a form of media, the public struggled to come to grips with the kind of messages this medium was passing on to children. These were no longer simple games of reflex that could be compared to similar nonvideo games or sports; these were forms of media that included stories and messages that concerned parents and children’s advocates. Arguments about the larger meaning of the games became common, with some seeing the games as driven by ideas of conquest and gender stereotypes, whereas others saw basic stories about traveling and exploration (Fuller & Jenkins, 1995).
Other Home Console Systems
Other companies were still interested in the home console market in the mid-1980s. Atari released the 2600 Jr. and the 7800 in 1986 after Nintendo’s success, but the consoles could not compete with Nintendo. The Sega Corporation, which had been involved with arcade video game production, released its Sega Master System in 1986. Although the system had more graphics possibilities than the NES, Sega failed to make a dent in Nintendo’s market share until the early 1990s, with the release of the Sega Genesis (Kerr, 2005).
Computer Games Flourish and Innovate
The enormous number of games available for Atari consoles in the early 1980s took its toll on video arcades. In 1983, arcade revenues fell to a 3-year low, leading game makers to turn to newer technologies that could not be replicated by home consoles. These included arcade games powered by laser discs, such as Dragon’s Lair and Space Ace, but their novelty soon wore off, and laser-disc games became museum pieces (Harmetz, 1984). By 1989, museums were already putting on exhibitions of arcade games from the early 1980s. Although newer games continued to come out on arcade platforms, they could not compete with the home console market and never matched their early-1980s successes. Increasingly, arcade gamers chose to stay at home to play games on computers and consoles. Today, dedicated arcades are a dying breed. Most that remain, like the Dave & Buster’s and Chuck E. Cheese’s chains, offer full-service restaurants and other entertainment attractions to draw in business.
Home games fared better than arcades because they could ride the wave of personal computer purchases that occurred in the 1980s. Some important developments in video games occurred in the mid-1980s with the development of online games. Multiuser dungeons, or MUDs, were role-playing games played online by multiple users at once. The games were generally text-based, describing the world of the MUD through text rather than illustrating it through graphics. The games allowed users to create a character and move through different worlds, accomplishing goals that awarded them with new skills. If characters attained a certain level of proficiency, they could then design their own area of the world. Habitat, a game developed in 1986 for the Commodore 64, was a graphic version of this type of game. Users dialed up on modems to a central host server and then controlled characters on screen, interacting with other users (Reimer, 2005).
During the mid-1980s, a demographic shift occurred. Between 1985 and 1987, games designed to run on business computers rose from 15 percent to 40 percent of games sold (Elmer-Dewitt et al., 1987). This trend meant that game makers could use the increased processing power of business computers to create more complex games. It also meant adults were interested in computer games and could become a profitable market.
The 1990s: The Rapid Evolution of Video Games
Video games evolved at a rapid rate throughout the 1990s, moving from the first 16-bit systems (named for the amount of data their processors could handle at once) in the early 1990s to the first Internet-enabled home console in 1999. As companies focused on new marketing strategies, wider audiences were targeted, and video games’ influence on culture began to be felt.
Console Wars
Nintendo’s dominance of the home console market throughout the late 1980s allowed it to build a large library of games for use on the NES. This also proved to be a weakness, however, because Nintendo was reluctant to improve or change its system for fear of making its game library obsolete. Technology had changed in the years since the introduction of the NES, and companies such as NEC and Sega were ready to challenge Nintendo with 16-bit systems (Slaven).
The Sega Master System had failed to challenge the NES, but with the release of its 16-bit system, the Sega Genesis, the company pursued a new marketing strategy. Whereas Nintendo targeted 8- to 14-year-olds, Sega’s marketing plan targeted 15- to 17-year-olds, making games that were more mature and advertising during programs such as the MTV Video Music Awards. The campaign successfully branded Sega as a cooler version of Nintendo and moved mainstream video games into a more mature arena. Nintendo responded to the Sega Genesis with its own 16-bit system, the Super NES, and began creating more mature games as well. Fighting games such as Mortal Kombat and Street Fighter II, released for both consoles, competed to raise the level of violence possible in a video game. Sega’s advertisements even suggested that its version of Mortal Kombat was better because of its more violent possibilities (Gamespot).
By 1994, companies such as 3DO, with its 32-bit system, and Atari, with its allegedly 64-bit Jaguar, attempted to get in on the home console market but failed to use effective marketing strategies to back up their products. Both systems fell out of production before the end of the decade. Sega, fearing that its system would become obsolete, released the 32-bit Saturn system in 1995. The system was rushed into production and did not have enough games available to ensure its success (Cyberia PC). Sony stepped in with its PlayStation console at a time when Sega’s Saturn was floundering and before Nintendo’s 64-bit system had been released. This system targeted an even older demographic of 14- to 24-year-olds and had a large effect on the market; by March 2007, Sony had sold 102 million PlayStations (Edge Staff, 2009).
Computer Games Gain Mainstream Acceptance
Computer games had avid players, but they were still a niche market in the early 1990s. An important step in the mainstream acceptance of personal computer games was the development of the first-person shooter genre. First popularized by the 1992 game Wolfenstein 3D, these games put the player in the character’s perspective, making it seem as if the player were firing weapons and being attacked. Doom, released in 1993, and Quake, released in 1996, used the increased processing power of personal computers to create vivid three-dimensional worlds that were impossible to fully replicate on video game consoles of the era. These games pushed realism to new heights and began attracting public attention for their graphic violence.
Another trend was reaching out to audiences outside of the video-game-playing community. Myst, an adventure game where the player walked around an island solving a mystery, drove sales of CD-ROM drives for computers. Myst, its sequel Riven, and other nonviolent games such as SimCity actually outsold Doom and Quake in the 1990s (Miller, 1999). These nonviolent games appealed to people who did not generally play video games, increasing the form’s audience and expanding the types of information that video games put across.
Online Gaming Gains Popularity
A major advance in game technology came with the increase in Internet use by the general public in the 1990s. A major feature of Doom was the ability to play against other people through the Internet. Strategy games such as Command & Conquer and Total Annihilation also included options where players could play each other over the Internet. Other fantasy-inspired role-playing games, such as Ultima Online, used the Internet to initiate the massively multiplayer online role-playing game (MMORPG) genre (Reimer). These games used the Internet as their platform, much like the text-based MUDs, creating a space where individuals could play the game while socially interacting with one another.
Portable Game Systems
The development of portable game systems was another important aspect of video games during the 1990s. Handheld games had been in use since the 1970s, and a system with interchangeable cartridges had even been sold in the early 1980s. Nintendo released the Game Boy in 1989 and, using the same principles that had made the NES successful, dominated the handheld market throughout the 1990s. The Game Boy was released with the game Tetris, using the game’s popularity to drive purchases of the unit. The unit’s simple design meant users could get 20 hours of playing time on a set of batteries, and this basic design was left essentially unaltered for most of the decade. More advanced handheld systems, such as the Atari Lynx and Sega Game Gear, could not compete with the Game Boy despite their superior graphics and color displays (Hutsko, 2000).
The decade-long success of the Game Boy belies the conventional wisdom of the console wars that more advanced technology makes for a more popular system. The Game Boy’s static, simple design was readily accessible, and its stability allowed for a large library of games to be developed for it. Despite using technology almost a decade old, the Game Boy accounted for 30 percent of Nintendo of America’s overall revenues at the end of the 1990s (Hutsko, 2000).
The Early 2000s: 21st-Century Games
The Console Wars Continue
Sega gave its final effort in the console wars with the Sega Dreamcast in 1999. This console could connect to the Internet, offering the kind of online play previously limited to the sophisticated computer games of the 1990s. The new features of the Sega Dreamcast were not enough to save the brand, however, and Sega discontinued production in 2001, leaving the console market entirely (Business Week).
A major problem for Sega’s Dreamcast was Sony’s release of the PlayStation 2 (PS2) in 2000. The PS2 could function as a DVD player, expanding the role of the console into an entertainment device that did more than play video games. This console was incredibly successful, enjoying a long production run, with more than 106 million units sold worldwide by the end of the decade (A Brief History of Game Console Warfare).
In 2001, two major consoles were released to compete with the PS2: the Xbox and the Nintendo GameCube. The Xbox was an attempt by Microsoft to enter the market with a console that expanded on the functions of other game consoles. The unit had features similar to a PC, including a hard drive and an Ethernet port for online play through its service, Xbox Live. The popularity of the first-person shooter game Halo, an Xbox exclusive release, boosted sales as well. Nintendo’s GameCube did not offer DVD playback capabilities, choosing instead to focus on gaming functions. Both of these consoles sold millions of units but did not come close to the sales of the PS2.
Computer Gaming Becomes a Niche Market
As consoles developed to rival the capabilities of personal computers, game developers began to focus more on games for consoles. From 2000 to the end of the decade, the popularity of personal computer games gradually declined. The computer gaming community, while still significant, now centers on players willing to spend a lot of money on personal computers designed specifically for gaming, often with multiple monitors and user modifications that allow them to run the newest games. This type of market, though profitable, is not large enough to compete with the audience for the much cheaper game consoles (Kalning, 2008).
The Evolution of Portable Gaming
Nintendo continued its control of the handheld game market into the 2000s with the 2001 release of the Game Boy Advance, a redesigned Game Boy that offered 32-bit processing and compatibility with older Game Boy games. In 2004, anticipating Sony’s upcoming handheld console, Nintendo released the Nintendo DS, a handheld console that featured two screens and Wi-Fi capabilities for online gaming. Sony’s PlayStation Portable (PSP) was released the following year and featured Wi-Fi capabilities as well as a flexible platform that could be used to play other media such as MP3s (Patsuris, 2004). These two consoles, along with their newer versions, continue to dominate the handheld market.
One interesting innovation in mobile gaming occurred in 2003 with the release of the Nokia N-Gage. The N-Gage was a combination of a game console and mobile phone that, according to consumers, did not fill either role very well. The product line was discontinued in 2005, but the idea of playing games on phones persisted and has been developed on other platforms (Stone, 2007). Apple currently dominates the industry of mobile phone games; in 2008 and 2009 alone, iPhone games generated $615 million in revenue (Farago, 2010). As mobile phone gaming grows in popularity and the supporting technology becomes increasingly advanced, traditional portable gaming platforms like the DS and the PSP will need to evolve to compete. Nintendo is already planning a successor to the DS that features 3-D graphics without requiring 3-D glasses, which it hopes will help the company retain and grow its share of the portable gaming market.
Video Games Today
The trends of the late 2000s have shown a steadily increasing market for video games. Newer control systems and family-oriented games have made it common for many families to engage in video game play as a group. Online games have continued to develop, gaining unprecedented numbers of players. The overall effect of these innovations has been the increasing acceptance of video game culture by the mainstream.
Home Consoles
The current state of the home console market still involves the three major companies of the past 10 years: Nintendo, Sony, and Microsoft. The release of Microsoft’s Xbox 360 led this generation of consoles in 2005. The Xbox 360 featured expanded media capabilities and integrated access to Xbox Live, an online gaming service. Sony’s PlayStation 3 (PS3) was released in 2006. It also featured enhanced online access as well as expanded multimedia functions, with the additional capacity to play Blu-ray discs. Nintendo released the Wii at the same time. This console featured a motion-sensitive controller that departed from previous controllers and focused on accessible, often family-oriented games. This combination successfully brought in large numbers of new game players, including many older adults. By June 2010, the Wii had sold 71.9 million units worldwide, the Xbox 360 had sold 40.3 million, and the PS3 trailed at 35.4 million (VGChartz, 2010). In the wake of the Wii’s success, Microsoft and Sony have introduced their own motion-sensitive systems (Mangalindan, 2010).
Key Takeaways
- In a time before personal computers, early video games allowed the general public to access technology that had been restricted to the realm of abstract science. Tennis for Two created an interface where anyone with basic motor skills could use a complex machine. The first video games functioned early on as a form of media by essentially disseminating the experience of computer technology to those without access to it.
- Video games reached wider audiences in the 1990s with the advent of the first-person shooter genre and popular nonaction games such as Myst. The games were marketed to older audiences, and their success increased demand for similar games.
- Online capabilities that developed in the 1990s and expanded in the 2000s allowed players to compete in teams. This innovation attracted larger audiences to gaming and led to new means of social communication.
- A new generation of accessible, family-oriented games in the late 2000s encouraged families to interact through video games. These games also brought in older demographics that had never used video games before.
Exercises
Video game marketing has changed to bring in more and more people to the video game audience. Think about the influence video games have had on you or people you know. If you have never played video games, then think about the ways your conceptions of video games have changed. Sketch out a timeline indicating the different occurrences that marked your experiences related to video games. Now compare this timeline to the history of video games from this section. Consider the following questions:
- How did your own experiences line up with the history of video games?
- Did you feel the effects of marketing campaigns directed at you or those around you?
- Were you introduced to video games during a surge in popularity? What games appealed to you?
References
Business Week, “PlayStation 2,” slide in “A Brief History of Game Console Warfare,” Business Week, http://images.businessweek.com/ss/06/10/game_consoles/.
Brand, Stewart. “Space War,” Rolling Stone, December 7, 1972.
Business Week, “Sega Dreamcast,” slide in “A Brief History of Game Console Warfare,” Business Week, http://images.businessweek.com/ss/06/10/game_consoles/.
Cross, Gary and Gregory Smits, “Japan, the U.S. and the Globalization of Children’s Consumer Culture,” Journal of Social History 38, no. 4 (2005).
CyberiaPC.com, “Sega Saturn (History, Specs, Pictures),” http://www.cyberiapc.com/vgg/sega_saturn.htm.
Edge staff, “The Making Of: Playstation,” Edge, April 24, 2009, http://www.next-gen.biz/features/the-making-of-playstation.
Egenfeldt-Nielsen, Simon. Understanding Video Games: The Essential Introduction (New York: Taylor & Francis, 2008), 50.
Elmer-Dewitt, Philip and others, “Computers: Games that Grownups Play,” Time, July 27, 1987, http://www.time.com/time/magazine/article/0,9171,965090,00.html.
Farago, Peter. “Apple iPhone and iPod Touch Capture U.S. Video Game Market Share,” Flurry (blog), March 22, 2010, http://blog.flurry.com/bid/31566/Apple-iPhone-and-iPod-touch-Capture-U-S-Video-Game-Market-Share.
Fuller, Mary and Henry Jenkins, “Nintendo and New World Travel Writing: A Dialogue,” Cybersociety: Computer-Mediated Communication and Community, ed. Steven G. Jones (Thousand Oaks, CA: Sage Publications, 1995), 57–72.
Gamespot, “When Two Tribes Go to War: A History of Video Game Controversy,” http://www.gamespot.com/features/6090892/p-5.html.
Harmetz, Aljean. “Video Arcades Turn to Laser Technology as Queues Dwindle,” Morning Herald (Sydney), February 2, 1984.
Herman, Leonard. “Early Home Video Game Systems,” in The Video Game Explosion: From Pong to PlayStation and Beyond, ed. Mark Wolf (Westport, CT: Greenwood Press, 2008), 54.
Hutsko, Joe. “88 Million and Counting; Nintendo Remains King of the Handheld Game Players,” New York Times, March 25, 2000, http://www.nytimes.com/2000/03/25/business/88-million-and-counting-nintendo-remains-king-of-the-handheld-game-players.html.
Kalning, Kristin. “Is PC Gaming Dying? Or Thriving?” MSNBC, March 26, 2008, http://www.msnbc.msn.com/id/23800152/wid/11915773/.
Kent, Steven. “Super Mario Nation,” American Heritage, September 1997, http://www.americanheritage.com/articles/magazine/ah/1997/5/1997_5_65.shtml.
Kerr, Aphra. “Spilling Hot Coffee? Grand Theft Auto as Contested Cultural Product,” in The Meaning and Culture of Grand Theft Auto: Critical Essays, ed. Nate Garrelts (Jefferson, NC: McFarland, 2005), 17.
Kline, Stephen, Nick Dyer-Witheford, and Greig De Peuter, Digital Play: The Interaction of Technology, Culture, and Marketing (Montreal: McGill-Queen’s University Press, 2003), 119.
Mangalindan, J. P. “Is Casual Gaming Destroying the Traditional Gaming Market?” Fortune, March 18, 2010, http://tech.fortune.cnn.com/2010/03/18/is-casual-gaming-destroying-the-traditional-gaming-market/.
McLaughlin, Rus. “IGN Presents the History of Super Mario Bros.,” IGN Retro, November 8, 2007, http://games.ign.com/articles/833/833615p1.html.
Miller, Stephen C. “News Watch; Most-Violent Video Games Are Not Biggest Sellers,” New York Times, July 29, 1999, http://www.nytimes.com/1999/07/29/technology/news-watch-most-violent-video-games-are-not-biggest-sellers.html.
Montfort, Nick and Ian Bogost, Racing the Beam: The Atari Video Computer System (Cambridge, MA: MIT Press, 2009), 127.
Patsuris, Penelope. “Sony PSP vs. Nintendo DS,” Forbes, June 7, 2004, http://www.forbes.com/2004/06/07/cx_pp_0607mondaymatchup.html.
Reimer, Jeremy. “The Evolution of Gaming: Computers, Consoles, and Arcade,” Ars Technica (blog), October 10, 2005, http://arstechnica.com/old/content/2005/10/gaming-evolution.ars/4.
Reimer, Jeremy. “Total share: 30 years of personal computer market share figures,” Ars Technica (blog), December 14, 2005, http://arstechnica.com/old/content/2005/12/total-share.ars/2.
Slaven, Andy. Video Game Bible, 1985–2002 (Victoria, BC: Trafford), 70–71.
Stone, Brad. “Play It Again, Nokia. For the Third Time,” New York Times, August 27, 2007, http://www.nytimes.com/2007/08/27/technology/27nokia.html.
VGChartz, “Weekly Hardware Chart: 19th June 2010,” http://www.vgchartz.com.
Wolf, Mark J. P. “Arcade Games of the 1970s,” in The Video Game Explosion (see note 7), 41.
10.3 Influential Contemporary Games
Learning Objectives
- Identify the effect electronic games have on culture.
- Select the most influential and important games released to the general public within the last few years.
- Describe ways new games have changed the video game as a form of media.
With such a short history, the place of video games in culture is constantly changing and being redefined. Are video games entertainment or art? Should they focus on fostering real-life skills or on developing virtual realities? Certain games have come to prominence in recent years for their innovations and genre-expanding attributes. These games are notable not only for great economic success and popularity but also for a visible influence on culture.
Guitar Hero and Rock Band
The musical series Guitar Hero, based on a Japanese arcade game of the late ’90s, was first launched in North America in 2005. In the game, the player uses a guitar-shaped controller to match the rhythms and notes of famous rock songs; the closer the player approximates the song, the better the score. Guitar Hero introduced a new genre of games in which players simulate playing musical instruments. Rock Band, released in 2007, uses a similar format, adding a microphone for singing, a drum set, and rhythm and bass guitars. These games are built on a premise similar to that of earlier rhythm-based games such as Dance Dance Revolution, in which players keep the rhythm on a dance pad. Dance Dance Revolution, introduced to North American audiences in 1999, was successful, but not to the extent that the later band-oriented games were. In 2008, music-based games brought in an estimated $1.9 billion.
Guitar Hero and Rock Band brought new means of marketing and a kind of cross-media stimulus with them. The songs featured in the games experienced increased downloads and sales—as much as an 840 percent increase in some cases (Peckham, 2008). The potential of this type of game did not escape its developers or the music industry. Games dedicated solely to one band were developed, such as Guitar Hero: Aerosmith and The Beatles: Rock Band. These games were a mix of music documentary, greatest hits album, and game. They included footage from early concerts, interviews with band members, and, of course, songs that allowed users to play along. When Guitar Hero: Aerosmith was released, the band’s catalog experienced a 40 percent increase in sales (Quan, 2008).
The rock band Metallica made its album Death Magnetic available for Guitar Hero III on the same day it was released as an album (Quan, 2008). Other innovations include Rock Band Network, a means for bands and individuals to create versions of their own songs for Rock Band that can be downloaded for a fee. The sporadic history of the video game industry makes it unclear if this type of game will maintain market share or even maintain its popularity, but it has clearly opened new avenues of expression as a form of media.
The Grand Theft Auto series
The first game in the Grand Theft Auto (GTA) series was released in 1997 for the PC and Sony PlayStation. The game had players stealing cars—not surprising given its title—and committing a variety of crimes to achieve specific goals. The game’s extreme violence made it popular with players of the late 1990s, but its true draw was the variety of options that players could employ in the game. Specific narratives and goals could be pursued, but if players wanted to simply drive around and explore the city, they could do that as well. A large variety of cars, from sports cars to tractor trailers, were available depending on the player’s goals. The violence could likewise be taken to any extreme the player wished, including stealing cars, killing pedestrians, and engaging the police in a shoot-out. This type of game is known as a sandbox, or open-world, game and is defined by the ability of users to freely pursue their own objectives (Donald, 2000).
The GTA series has evolved over the past decade by increasing the realism, options, and explicit content of the first game. GTA III and GTA IV, as well as a number of spin-off games, such as the recent addition The Ballad of Gay Tony, have made the franchise more profitable and more controversial. These newer games have expanded on the idea of an open video game world, allowing players to have their characters buy and manage businesses, play unrelated mini-games (such as bowling and darts), and listen to a wide variety of in-game music, talk shows, and even television programs. However, increasing freedom also results in increasing controversy, as players can choose to solicit prostitutes, visit strip clubs, perform murder sprees, and assault law enforcement agents. Lawsuits have attempted to tie the games to real-life instances of violence, and GTA games are routinely the target of political investigations into video game violence (Morales, 2005).
World of Warcraft
World of Warcraft (WoW), released in 2004, is a massively multiplayer online role-playing game (MMORPG) loosely based on the Warcraft strategy franchise of the 1990s. The game is conducted entirely online, though it is accessed through purchased software, and players purchase playing time. Each player chooses an avatar, or character, belonging to one of several races, such as orcs, elves, and humans. Characters can spend their time completing quests, learning trades, or simply interacting with other characters. As characters gain experience, they acquire skills and earn virtual money. Players also choose a server type: on a PvP (player versus player) server, characters can attack one another without prior agreement; on a normal server, players can fight each other only if both consent; and a third server type is reserved for players who want to role-play, or act in character.
Various organizations have sprung up within the WoW universe. Guilds are groups that subscribe to specific codes of conduct and work together to complete tasks that cannot be accomplished by a lone individual. Guilds are organized by the players themselves, not maintained by WoW developers, and each has its own unique identity and social rules, much like a college fraternity or social club. Voice communication technology allows players to speak to one another as they complete missions, deepening the social bonds that hold such organizations together (Barker, 2006).
WoW has taken the medium of video games to unprecedented levels. Although series such as Grand Theft Auto allow players a great deal of freedom, everything done in the games was accounted for at some point. WoW, which depends on the actions of millions of players to drive the game, allows people to literally live their lives through a game. In the game, players can earn virtual gold by mining it, killing enemies, and killing other players. It takes a great deal of time to accumulate gold in this manner, so many wealthy players choose to buy this gold with actual dollars. This is technically against the rules of the game, but these rules are unenforceable. Entire real-world industries have developed from this trade in gold. Chinese companies employ workers, or “gold farmers,” who work 10-hour shifts finding gold in WoW so that the company can sell it to clients. Other players make money by finding deals on virtual goods and then selling them for a profit. One WoW player even “traveled” to Asian servers to take advantage of cheap prices, conducting a virtual import–export business (Davis, 2009).
The unlimited possibilities in such a game expand the idea of what a game is. It is obvious that an individual who buys a video game, takes it home, and plays it during his or her leisure is, in fact, playing a game. But if that person is a “gold farmer” doing repetitious tasks in a virtual world to make a real-world living, the situation is not as clear. WoW challenges conventional notions of what a game is by allowing the players to create their own goals. To some players, the goal may be to gain a high level for their character; others may be interested in role-playing, whereas others are focused on making a profit. This kind of flexibility leads to the development of scenarios never before encountered in game-play, such as the development of economic classes.
Call of Duty: Modern Warfare
The Call of Duty series of first-person shooter games is notable for its record-breaking success in the video game market, generating more than $3 billion in retail sales through late 2009 (Ivan, 2009). Call of Duty: Modern Warfare 2 was released in 2009 to critical acclaim and a great deal of controversy. The game included a 5-minute sequence in which the player, as a CIA agent infiltrating a terrorist cell, takes part in a massacre of innocent civilians. The player was not required to shoot civilians and could skip the sequence if desired, but these options did not stop international attention and calls to ban the game (Games Radar, 2009). Proponents of the series argue that Call of Duty has a Mature rating and is not meant to be played by minors. They also point out that the games are less violent than many modern movies. However, the debate has continued, escalating as far as the United Kingdom’s House of Commons (Games Radar, 2009).
Wii Sports and Wii Fit
The Nintendo Wii, with its dedicated motion-sensitive controller, went on sale in 2006. The company had experimented with similar controllers in the past, including the Power Glove in 1989, but it had never based an entire console around such a device. The Wii’s simple design was combined with basic games such as Wii Sports to appeal to previously untapped audiences. Wii Sports was included with the purchase of the Wii console and served as a demonstration of the new technology. It included five games: baseball, bowling, boxing, tennis, and golf. Wii Sports let groups play together without any prior familiarity with video games; it was closer to outdoor social games such as horseshoes or croquet than it was to Doom. There was also nothing objectionable about it: no violence, no in-your-face intensity, just a game that even older people could access and enjoy. Retirement communities sometimes organized Wii Bowling tournaments, and many people found the game to be a new way to socialize with their friends and families (Wischnowsky, 2007).
Wii Fit combined the previously incompatible terms “fitness” and “video games.” Using a pressure-sensitive platform, players could do aerobics, strength training, and yoga. The game kept track of players’ weights, acting as a kind of virtual trainer (Vella, 2008). Wii Fit used the potential of video games to create an interactive version of an exercise machine, competing with workout videos and other forms of fitness that had never before considered Nintendo a rival. This kind of design used the inherent strengths of video games to create a new kind of experience.
Nintendo found most of its past success marketing to younger demographics with games that were less controversial than the 1990s first-person shooters. Wii Sports and Wii Fit saw Nintendo playing to these strengths and expanding on them with family-friendly games that encouraged multiple generations to use video games as a social platform. The campaign was so successful that it is being imitated by rival companies Sony and Microsoft, which have released the PlayStation Move and the Kinect, respectively.
Key Takeaways
- Guitar Hero and Rock Band created a new means of marketing for the music industry. Music featured on the games experienced increased downloads and sales.
- The Grand Theft Auto series was revolutionary and controversial for its open-ended design. Players could choose from a number of different options, allowing them to set their own goals and create their own version of the game.
- World of Warcraft has brought the MMORPG genre to new heights of popularity. The large number of users has made the game evolve to a level of complexity unheard of in video games.
- Wii Sports and Wii Fit brought video games to audiences that had never been tapped by game makers. Older adults and families used Wii Sports as a means of family bonding, and Wii Fit took advantage of motion-controlled game playing to enter the fitness market.
Exercises
Think about the ways in which the games from this section were innovative and groundbreaking. Consider the following questions:
- What new social, technological, or cultural areas did they explore?
- Pick one of these areas—social, technological, or cultural—and write down ways in which video games could have a future influence in this area.
References
Barker, Carson. “Team Players: Guilds Take the Lonesome Gamer Out of Seclusion…Kind Of,” Austin Chronicle, July 28, 2006, http://www.austinchronicle.com/gyrobase/Issue/story?oid=oid%3A390551.
Davis, Rowenna. “Welcome to the New Gold Mines,” Guardian (London), March 5, 2009, http://www.guardian.co.uk/technology/2009/mar/05/virtual-world-china?intcmp=239.
Donald, Ryan. Review of Grand Theft Auto (PlayStation), CNET, April 28, 2000, http://reviews.cnet.com/legacy-game-platforms/grand-theft-auto-playstation/4505-9882_7-30971409-2.html.
Games Radar, “The Decade in Gaming: The 10 Most Shocking Moments of the Decade,” December 29, 2009, http://www.gamesradar.com/f/the-10-most-shocking-game-moments-of-the-decade/a-20091221122845427051/p-2.
Ivan, Tom. “Call of Duty Series Tops 55 Million Sales,” Edge, November 27, 2009, http://www.edge-online.com/news/call-of-duty-series-tops-55-million-sales.
Morales, Tatiana. “Grand Theft Auto Under Fire,” CBS News, July 14, 2005, http://www.cbsnews.com/stories/2005/07/13/earlyshow/living/parenting/main708794.shtml.
Peckham, Matt. “Music Sales Rejuvenated by Rock Band, Guitar Hero,” PC World, December 22, 2008, http://www.washingtonpost.com/wp-dyn/content/article/2008/12/22/AR2008122200798.html.
Quan, Denise. “Is ‘Guitar Hero’ Saving Rock ’n’ Roll?” CNN, August 28, 2008, http://www.cnn.com/2008/SHOWBIZ/Music/08/20/videol.games.music/.
Vella, Matt. “Wii Fit Puts the Fun in Fitness,” Business Week, May 21, 2008, http://www.businessweek.com/innovate/content/may2008/id20080520_180427.htm.
Wischnowsky, Dave. “Wii Bowling Knocks Over Retirement Home,” Chicago Tribune, February 16, 2007, http://www.chicagotribune.com/news/local/chi-070216nintendo,0,2755896.story.
10.4 The Impact of Video Games on Culture
Learning Objectives
- Describe gaming culture and how it has influenced mainstream culture.
- Analyze the ways video games have affected other forms of media.
- Describe how video games can be used for educational purposes.
- Identify the arguments for and against the depiction of video games as an art.
An NPD poll conducted in 2007 found that 72 percent of the U.S. population had played a video game that year (Faylor, 2008). With so many people playing, video games are having an undeniable effect on culture, an effect clearly visible in the growing mainstream acceptance of gaming culture. Video games have also changed the way many other forms of media, from music to film, are produced and consumed. Education, too, has been reshaped, as educational games such as Brain Age give teachers and students new ways to communicate and learn. As video games exert an increasing influence on our culture, many have weighed in on whether this form of media should be considered an art.
Game Culture
To fully understand the effects of video games on mainstream culture, it is important to understand the development of gaming culture, or the culture surrounding video games. Video games, like books or movies, have avid users who have made this form of media central to their lives. In the early 1970s, programmers got together in groups to play Spacewar!, spending a great deal of time competing in a game that was rudimentary compared to modern games (Brand). As video arcades and home video game consoles gained in popularity, youth culture quickly adapted to this type of media, engaging in competitions to gain high scores and spending hours at the arcade or with the home console.
In the 1980s, an increasing number of kids were spending time playing games on consoles and, more importantly, increasingly identifying with the characters and products associated with the games. Saturday morning cartoons were based on the Pac-Man and Super Mario Bros. games, and an array of nongame merchandise was sold bearing video game logos and characters. The public recognition of some of these characters has made them cultural icons. A poll taken in 2007 found that more Canadians surveyed could identify a photo of Mario, from Super Mario Bros., than a photo of the current Canadian prime minister (Cohn & Toronto, 2007).
As the kids who first played Super Mario Bros. began to outgrow video games, companies such as Sega, and later Sony and Microsoft, began making games to appeal to older demographics. This has increased the average age of video game players, which was 35 in 2009 (Entertainment Software Association, 2009). The Nintendo Wii has even found a new demographic in retirement communities, where Wii Bowling has become a popular form of entertainment for residents (Wischnowsky). The gradual rise in the age of players has helped establish video games as a mainstream form of entertainment.
The Subculture of Geeks
The acceptance of video games in mainstream culture has consequently changed the way the culture views certain people. “Geek” was the name given to people who were adept at technology but lacking in the skills that tended to make one popular, such as fashion sense or athletic ability. Because they often did not fare well socially, many of these people favored imaginary worlds such as those found in the fantasy and science fiction genres. Video games appealed because they offered both a fantasy world and a means to excel at something. Jim Rossignol, in his 2008 book This Gaming Life: Travels in Three Cities, described the lure of playing Quake III online in exactly these terms.
Video games gave a group of excluded people a way to gain proficiency in the social realm. As video games became more of a mainstream phenomenon and video game skills began to be desired by large numbers of people, the popular idea of geeks changed. It is now common to see the term “geek” used to mean a person who understands computers and technology. This former slur is also prominent in the media, with 2010 headlines such as “Geeks in Vogue: Top Ten Cinematic Nerds” (Sharp, 2010).
Many media stories focusing on geeks examine the ways in which this subculture has been accepted by the mainstream. Geeks may have become “cooler,” but mainstream culture has also become “geekier.” The acceptance of geek culture has led to acceptance of geek aesthetics. The mainstreaming of video games has led to acceptance of fantasy or virtual worlds. This is evident in the popularity of film/book series such as The Lord of the Rings and Harry Potter. Comic book characters, emblems of geek culture, have become the vehicles for blockbuster movies such as Spider-Man and The Dark Knight. The idea of a fantasy or virtual world has come to appeal to greater numbers of people. Virtual worlds such as those represented in the Grand Theft Auto and Halo series and online games such as World of Warcraft have expanded the idea of virtual worlds so that they are not mere means of escape but new ways to interact (Konzack, 2006).
The Effects of Video Games on Other Types of Media
Video games during the 1970s and ’80s were often derivatives of other forms of media. E.T., Star Wars, and a number of other games took their cues from movies, television shows, and books. This began to change in the 1980s with the development of cartoons based on video games, and in the 1990s and 2000s with live-action feature films based on video games.
Television
Television programs based on video games were an early phenomenon. Pac-Man, Pole Position, and Q*bert were among the animated programs that aired in the early 1980s. In the later 1980s, shows such as The Super Mario Bros. Super Show! and The Legend of Zelda promoted Nintendo games. In the 1990s, Pokémon, originally a game developed for the Nintendo Game Boy, was turned into a television series, a card game, several movies, and even a musical (Internet Movie Database). Recently, several programs have been developed that revolve entirely around video games—the web series The Guild, for instance, tells the story of a group of friends who interact through an unspecified MMORPG.
Nielsen, the company that tabulates television ratings, has begun rating video games in a similar fashion. In 2010, this information showed that video games, as a whole, could be considered a kind of fifth network, along with the television networks NBC, ABC, CBS, and Fox (Shields, 2009). Advertisers use Nielsen ratings to decide which programs to support. The use of this system is changing public perceptions to include video game playing as a habit similar to television watching.
Video games have also influenced the way that television is produced. The Rocket Racing League, scheduled to be launched in 2011, will feature a “virtual racetrack.” Racing jets will travel along a virtual track that can only be seen by pilots and spectators with enabled equipment. Applications for mobile devices are being developed that will allow spectators to race virtual jets alongside the ones flying in real time (Hadhazy, 2010). This type of innovation is only possible with a public that has come to demand and rely on the kind of interactivity that video games provide.
Film
The rise in film adaptations of video games accompanies the increased age of video game users. In 1995, Mortal Kombat, a live-action movie based on the video game, grossed over $70 million at the box office, placing it 22nd in the rankings for that year (Box Office Mojo). Lara Croft: Tomb Raider, released in 2001, starred well-known actress Angelina Jolie and ranked No. 1 at the box office when it was released, and 15th overall for the year (Box Office Mojo). Films based on video games are an increasingly common sight at the box office, such as producer Jerry Bruckheimer’s Prince of Persia, or the recent sequel to Tron, based on the idea of a virtual gaming arena.
Another aspect of video games’ influence on films is how video game releases are marketed and perceived. The release date of the anticipated Grand Theft Auto IV was announced and marketed to compete with the release of the film Iron Man, and the game reportedly beat the movie by $300 million in sales. This kind of comparison is, in some ways, misleading. Video games cost much more than a movie ticket, so higher revenue does not mean that more people bought the game than saw the movie. The distribution apparatus for the two media is also totally different: movies can only be released in theaters, whereas video games can be sold at any retail outlet (Associated Press, 2008). What this kind of news story shows, however, is that the general public now considers video games something akin to film. It is also important to realize that the scale of production and profit for video games is similar to that of films. Video games involve music scores, actors, and directors in addition to game designers, and the budgets for major games reflect this: Grand Theft Auto IV cost an estimated $100 million to produce (Bowditch, 2008).
Music
Video games have been accompanied by music ever since the days of the arcade. Video game music was originally limited to computer beeps turned into theme songs. The design of the Nintendo 64, Sega Saturn, and Sony PlayStation made it possible to use sampled audio on new games, meaning songs played on physical instruments could be recorded and used on video games. Beginning with the music of the Final Fantasy series, scored by famed composer Nobuo Uematsu, video game music took on film score quality, complete with full orchestral and vocal tracks. This innovation proved beneficial to the music industry. Well-known musicians such as Trent Reznor, Thomas Dolby, Steve Vai, and Joe Satriani were able to create the soundtracks for popular games, giving these artists exposure to new generations of potential fans (Video Games Music Big Hit, 1997). Composing music for video games has turned into a profitable means of employment for many musicians. Schools such as Berklee College of Music, Yale, and New York University have programs that focus on composing music for video games. The students are taught many of the same principles that are involved in film scoring (Khan, 2010).
Many rock bands have allowed their previously recorded songs to be used in video games, similar to a hit song being used on a movie soundtrack. The bands are paid for the rights to use the song, and their music is exposed to an audience that otherwise might not hear it. As mentioned earlier, games like Rock Band and Guitar Hero have been used to promote bands. The release of The Beatles: Rock Band was timed to coincide with the release of digitally remastered reissues of the Beatles’ albums.
Another phenomenon relating to music and video games involves musicians covering video game music. A number of bands perform only video game covers in a variety of styles, such as the popular Japanese group the Black Mages, which performs rock versions of Final Fantasy music. Playing video game themes is not limited to rock bands, however. An orchestra and chorus called Video Games Live started a tour in 2005 dedicated to playing well-known video game music. Their performances are often accompanied by graphics projected onto a screen showing relevant sequences from the video games (Play Symphony).
Machinima
Recently, the connection between video games and other media has increased with the popularity of machinima, animated films and series created by recording character actions inside video games. Beginning with the short film “Diary of a Camper,” filmed inside the game Quake in 1996, fans of video games have adopted the technique of machinima to tell their own stories. Although these early movies were released only online and targeted a select niche of gamers, professional filmmakers have since adopted the process, using machinima to storyboard scenes and to add a sense of individuality to computer-generated shots. This new form of media is increasingly becoming mainstream, as TV shows such as South Park and channels such as MTV2 have introduced machinima to a larger audience (Strickland).
Video Games and Education
One sign of the mainstreaming of video games is the growing number of educational institutions that embrace them. As early as the 1980s, games such as Number Munchers and Word Munchers were designed to help children develop basic math and grammar skills. In 2006, the Federation of American Scientists completed a study that endorsed video game use in education, citing the facts that video game systems were present in most households, that kids favored learning through video games, and that games could be used to build analytical skills (Feller, 2006). Another study, published in the science journal Nature, found that regular video game players had better-developed visual-processing skills than people who did not play video games. Participants in the test were asked to play a first-person shooter game for 1 hour a day for 10 days and were then tested for specific visual attention skills. The playing improved these skills in all participants, but the regular video game players reached a greater skill level than the non–game players. According to the study, “Although video-game playing may seem to be rather mindless, it is capable of radically altering visual attention processing” (Green & Bavelier, 2003).
Other educational institutions have begun to embrace video games as well. The Boy Scouts of America have created a “belt loop,” something akin to a merit badge, for tasks including learning to play a parent-approved game and developing a schedule to balance video game time with homework (Murphy, 2010). The federal government has also seen the educational potential of video games. A commission on balancing the federal budget suggested a video game that would educate Americans about the costs involved (Wolf, 2010). The military has similarly embraced video games as training simulators for new soldiers. These simulators, built on newer game technologies, present realistic scenarios that soldiers could face in the field. The games have also been used as recruiting tools by the U.S. Army and the Army National Guard (Associated Press, 2003).
Whether in schools or in the public arena, the educational use of video games means that the medium has been validated by established cultural authorities. Many individuals still resist the idea that video games can be beneficial or have a positive cultural influence, but their embrace by educational institutions lends them legitimacy.
Video Games as Art
While video games are universally accepted as a form of media, a debate has recently arisen over whether they can be considered a form of art. Roger Ebert, the well-known film critic, famously argued that “video games can never be art,” citing the fact that video games are meant to be won, whereas art is meant to be experienced (Ebert, 2010).
His remarks have generated an outcry from both video gamers and developers. Many point to games such as 2009’s Flower, in which players control the flow of flower petals in the wind, as examples of video games developing into art. Flower avoids specific plot and characters to allow the player to focus on interaction with the landscape and the emotion of the game-play (That Game Company). Likewise, more mainstream games such as the popular Katamari series, released in 2004, are built around the idea of creation, requiring players to pull together a massive clump of objects in order to create a star.
Video games, once viewed as a mindless source of entertainment, are now being featured in publications such as The New Yorker magazine and The New York Times (Fisher, 2010). With the development of increasingly complex musical scores and the advent of machinima, the boundaries between video games and other forms of media are slowly blurring. While they may not be considered art by everyone, video games have contributed significantly to modern artistic culture.
Key Takeaways
- The aesthetics and principles of gaming culture have had an increasing effect on mainstream culture. This has led to the gradual acceptance of marginalized social groups and increased comfort with virtual worlds and the pursuit of new means of interaction.
- Video games have gone from being a derivative medium that took its cues from other media, such as books, films, and music, to being a form of media that other types derive new ideas from. Video games have also interacted with older forms of media to change them and create new means of entertainment and interaction.
- Educational institutions have embraced the use of video games as valuable tools for teaching. These tools include simulated worlds in which important life skills can be learned and improved.
- While video games may not be accepted by everyone as a form of art, there is no doubt that they contribute greatly to artistic media such as music and film.
Exercises
Think about the ways in which video games have influenced and affected other forms of media. Then consider the following questions:
- Are there things video games will never be able to offer?
- Write down several examples of ways in which other forms of media are not replicated by video games. Then speculate on ways video games could eventually emulate these forms.
References
Associated Press, “‘Grand Theft Auto IV’ Beats ‘Iron Man’ by $300 Million,” Fox News, May 9, 2008, http://www.foxnews.com/story/0,2933,354711,00.html.
Associated Press, “Military Training Is Just a Game,” Wired, October 3, 2003, http://www.wired.com/gaming/gamingreviews/news/2003/10/60688.
Bowditch, Gillian. “Grand Theft Auto Producer is Godfather of Gaming,” Times (London), April 27, 2008, http://www.timesonline.co.uk/tol/news/uk/scotland/article3821838.ece.
Box Office Mojo, “Lara Croft: Tomb Raider,” http://www.boxofficemojo.com/movies/?id=tombraider.htm.
Box Office Mojo, “Mortal Kombat,” http://boxofficemojo.com/movies/?id=mortalkombat.htm.
Brand, “Space War.”
Cohn & Wolfe Toronto, “Italian Plumber More Memorable Than Harper, Dion,” news release, November 13, 2007, http://www.newswire.ca/en/releases/mmnr/Super_Mario_Galaxy/index.html.
Ebert, Roger. “Video Games Can Never Be Art,” Chicago Sun-Times, April 16, 2010, http://blogs.suntimes.com/ebert/2010/04/video_games_can_never_be_art.html.
Entertainment Software Association, Essential Facts About the Computer and Video Game Industry: 2009 Sales, Demographic, and Usage Data, 2009, http://www.theesa.com/facts/pdfs/ESA_EF_2009.pdf.
Faylor, Chris. “NPD: 72% of U.S. Population Played Games in 2007; PC Named ‘Driving Force in Online Gaming,’” Shack News, April 2, 2008, http://www.shacknews.com/onearticle.x/52025.
Feller, Ben. “Group: Video Games Can Reshape Education,” MSNBC, October 18, 2006, http://www.msnbc.msn.com/id/15309615/from/ET/.
Fisher, Max. “Are Video Games Art?” Atlantic Wire, April 19, 2010, http://www.theatlanticwire.com/features/view/feature/Are-Video-Games-Art-1085/.
Green, C. Shawn, and Daphne Bavelier, “Action Video Game Modifies Visual Selective Attention,” Nature 423, no. 6939 (2003): 534–537.
Hadhazy, Adam. “‘NASCAR of the Skies’ to Feature Video Game-Like Interactivity,” TechNewsDaily, April 26, 2010, http://www.technewsdaily.com/nascar-of-the-skies-to-feature-video-game-like-interactivity–0475/.
Internet Movie Database, “Pokémon,” http://www.imdb.com/.
Khan, Joseph P. “Berklee is Teaching Its Students to Compose Scores for Video Games,” Boston Globe, January 19, 2010, http://www.boston.com/news/education/higher/articles/2010/01/19/berklee_is_teaching_students_to_compose_scores_for_video_games/.
Konzack, Lars. “Geek Culture: The 3rd Counter-Culture,” (paper, FNG2006, Preston, England, June 26–28, 2006), http://www.scribd.com/doc/270364/Geek-Culture-The-3rd-CounterCulture.
Murphy, David. “Boy Scouts Develop ‘Video Game’ Merit Badge,” PC Magazine, May 2, 2010, http://www.pcmag.com/article2/0,2817,2363331,00.asp.
Play Symphony, Jason Michael Paul Productions, “About,” Play! A Video Game Symphony, http://www.play-symphony.com/about.php.
Rossignol, Jim. This Gaming Life: Travels in Three Cities (Ann Arbor, MI: University of Michigan Press, 2008), 17.
Sharp, Craig. “Geeks in Vogue: Top Ten Cinematic Nerds,” Film Shaft, April 26, 2010, http://www.filmshaft.com/geeks-in-vogue-top-ten-cinematic-nerds/.
Shields, Mike. “Nielsen: Video Games Approach 5th Network Status,” Adweek, March 25, 2009, http://www.adweek.com/aw/content_display/news/agency/e3i4f087b1aeac6f008d0ecadfeffe4a191.
Strickland, Jonathan. “How Machinima Works,” HowStuffWorks.com, http://entertainment.howstuffworks.com/machinima3.htm.
That Game Company, “Flower,” http://thatgamecompany.com/games/flower/.
“Video Games Music Big Hit,” Wilmington (NC) Morning Star, February 1, 1997, 36.
Wischnowsky, “Wii Bowling.”
Wolf, Richard. “Nation’s Soaring Deficit Calls for Painful Choices,” USA Today, April 14, 2010, http://www.usatoday.com/news/washington/2010-04-12-deficit_N.htm.
10.5 Controversial Issues
Learning Objectives
- Describe controversial issues related to modern video games.
- Analyze the issues and problems with rating electronic entertainment.
- Discuss the effects of video game addiction.
- Examine the gender issues surrounding video games.
The increasing realism and expanded possibilities of video games have inspired a great deal of controversy. Yet even early games, though rudimentary and seemingly laughable nowadays, raised controversy over their depiction of adult themes. Although the increased realism and graphics capabilities of contemporary video games have heightened the shock value of in-game violence, cultures around the world have been struggling to come to terms with video game violence since the dawn of the medium.
Violence
Violence in video games has been controversial from their earliest days. Death Race, an arcade game released in 1976, encouraged drivers to run over stick figures, which then turned into Xs. Although the programmers claimed that the stick figures were not human, the game was controversial, making national news on the TV talk show Donahue and the TV news magazine 60 Minutes. Video games, regardless of their realism or lack thereof, had added a new potential to the world of games and entertainment: the ability to simulate murder.
The enhanced realism of video games in the 1990s accompanied a rise in violent games as companies expanded the market to target older demographics. A great deal of controversy exists over the influence of this kind of violence on children, and also over the rating system that is applied to video games. There are many stories of real-life violent acts involving video games. The 1999 Columbine High School massacre was quickly linked to the teenage perpetrators’ enthusiasm for video games. The families of Columbine victims brought a lawsuit against 25 video game companies, claiming that if the games had not existed, the massacre would not have happened (Ward, 2001). In 2008, a 17-year-old boy shot both of his parents, killing his mother, after they took away his video game system (Harvey, 2009). Also in 2008, when six teens were arrested for attempted carjacking and robbery, they stated that they were reenacting scenes from Grand Theft Auto (Cochran, 2008).
There is no shortage of news stories that involve young men committing crimes relating to an obsession with video games. The controversy has not been resolved regarding the influences behind these crimes. Many studies have linked aggression to video games; however, critics take issue with using the results of these studies to claim that the video games caused the aggression. They point out that people who enact video-game–related crimes already have psychopathic tendencies, and that the results of such research studies are correlational rather than causational—a naturally violent person is drawn to play violent video games (Adams, 2010). Other critics point out that violent games are designed for adults, just as violent movies are, and that parents should enforce stricter standards for their children.
The problem of children’s access to violent games is a large and complex one. Video games present difficult issues for those who create the ratings. One problem is the inconsistency that seems to exist in rating video games and movies. Movies with violence or sexual themes are rated either R or NC-17. Filmmakers prefer the R rating over the NC-17 rating because NC-17 ratings hurt box office sales, and they will often heavily edit films to remove overly graphic content. The Entertainment Software Rating Board (ESRB) rates video games. The two most restrictive ratings the ESRB has put forth are “M” (for Mature; 17 and older; “may contain mature sexual themes, more intense violence, and/or strong language”) and “AO” (for Adults Only; 18 and up; “may include graphic depictions of sex and/or violence”). If this rating system were applied to movies, many movies now rated R would be labeled AO. An AO label can have a devastating effect on game sales; in fact, many retail outlets will not sell games with an AO rating (Hyman, 2005). This creates a situation where a video game with a sexual or violent scene as graphic as the ones seen in R-rated movies is difficult to purchase, whereas a pornographic magazine can be bought at many convenience stores. This issue reveals a unique aspect of video games. Although many of them are designed for adults, the distribution system and culture surrounding video games is still largely youth-oriented.
Video Game Addiction
Another controversial issue is the problem of video game addiction. As of the print date, the American Medical Association (AMA) has not created an official diagnosis of video game addiction, citing the lack of long-term research. However, the AMA uses the term “video game overuse” to describe video game use that begins to affect other aspects of an individual’s life, such as relationships and health. Studies have found that socially marginalized people have more of a tendency to overuse games, especially online role-playing games like World of Warcraft. Other studies have found that patterns of time usage and social dysfunction in players who overuse games are similar to those of other addictive disorders (Khan, 2007).
Groups such as Online Gamers Anonymous have developed a 12-step program similar to that of Alcoholics Anonymous to help gamers deal with problems relating to game overuse. This group is run by former online gamers and family members of those affected by heavy game use (On-line Gamers Anonymous). This problem is not new, but it has become more prevalent. In the early 1990s, many stories surfaced of individuals dropping out of college or getting divorced because of addiction to multiuser dungeons, or MUDs (Greene, 1998). In addition, heavy video gaming, much like heavy computer use in an office setting, can result in painful repetitive stress injuries. Even worse are the rare, but serious, cases of death resulting from video game overuse. In the early ’80s, two deaths were linked to the video game Berzerk. The players, both in their late teens, suffered fatal heart attacks while struggling to achieve top scores (Arcade History). The issue of video game addiction has become a larger one because of the ubiquity of video games and Internet technology. In countries that have a heavily wired infrastructure, such as South Korea, the problem is even bigger. In 2010, excessive game use was problematic enough that the South Korean government imposed an online gaming curfew for people under the age of 18 that would block certain sites after midnight. This decision followed the death of a 3-month-old baby from starvation while her parents played an online game at an Internet café (Cain, 2010).
Another side of video game addiction is told well by Jim Rossignol in his book This Gaming Life: Travels in Three Cities. The book describes Rossignol’s job as a journalist for a financial company and his increasing involvement with Quake III. Rossignol trained a team of players to compete in virtual online tournaments, scheduling practices and spending the hours afterward analyzing strategies with his teammates. His intense involvement in the game led to poor performance at his job, and he was eventually fired. After being fired, he spent even more time on the game, not caring about his lack of a job or shrinking savings. The story up to this point sounds like a testimonial about the dangers of game addiction. However, because of his expertise in the game, he was hired by a games magazine and enjoyed full-time employment writing about what he loved doing. Rossignol does not gloss over the fact that games can have a negative influence, but his book speaks to the ways in which gaming—often what would be described as obsessive gaming—can cause positive change in people’s lives (Rossignol).
Sexism
It is no secret that young adult men make up the majority of video gamers. A study in 2009 found that 60 percent of gamers were male, and the average age of players was 35 (Entertainment Software Association, 2009). While the gender gap has certainly narrowed in the past 30 years, video gaming is still in many ways a male-dominated medium.
Male influence can be seen throughout the industry. Women make up less than 12 percent of game designers and programmers, and those who do enter the field often find themselves facing subtle—and not so subtle—sexism (Media Awareness Network). When Game Developer magazine released its list of top 50 people in the video game industry for 2010, bloggers were quick to note that no female developers appeared in the list (Doctorow, 2010). In 2007, scandal erupted over a pornographic comic featuring Jade Raymond, current managing director of the French video game publisher Ubisoft, that surfaced on an online forum. The motivation behind the comic was to allege that Raymond did not deserve her position at Ubisoft because she earned it based on her looks, rather than on her abilities and experience.
Sexism in video games has existed since the early days of the medium. The plot of the infamous Custer’s Revenge, released for the Atari 2600 in 1982, centered on the rape of a Native American woman. Popular NES games such as Super Mario Bros. and The Legend of Zelda featured a male figure rescuing a damsel in distress. Both the protagonist and antagonist in the original Tomb Raider game had hourglass figures with prominent busts and nonexistent waists, a trend that continues in the franchise today. In 2003, the fighting series Dead or Alive released a spin-off game titled Dead or Alive Xtreme Beach Volleyball that existed to showcase the well-endowed female characters in swim attire (Strauss, 2010). The spin-off was so popular that two more similar games were released.
Some note that video games are not unique in their demeaning portrayal of women. Like movies, television, and other media forms, video games often fall back on gender stereotyping in order to engage consumers. Defenders point out that many male video game characters are also depicted lewdly. Games such as God of War and Mortal Kombat feature hypersexualized men with bulging muscles and aggressive personalities who rely on their brawn rather than their brains. How are men affected by these stereotypes? Laboratory studies have shown that violence and aggression in video games affect men more than women, leading to higher levels of male aggression (Bartholow & Anderson, 2002). While sexism is certainly present in video games, it seems that sexual stereotyping affects both genders negatively.
In recent years, game designers have sought to move away from these clichéd representations of gender. The popular game Portal, released in 2007, features a female protagonist clad in a simple orange jumpsuit who relies on her wits to solve logic puzzles. Series such as Half-Life and Phoenix Wright: Ace Attorney star male heroes who are intellectuals instead of warriors. Other games, like the Mass Effect series and Halo: Reach, allow gamers to choose the gender of the main character without altering elements of the plot or game-play. However, despite recent strides forward, there is no doubt that as the video game industry continues to evolve, so too will the issue of gender.
Key Takeaways
- Video game violence has been an issue since the 1976 game Death Race. The potential of video games to simulate murder created a new issue in entertainment media.
- A great number of news stories link video games with violent crimes. Studies have found a correlation between aggressive behavior and video games, but critics claim that these studies do not prove that video games cause violent acts.
- The video game rating system informs purchasers about the content of a game. The highest rating, Adults Only, hurts video game sales, so companies try to make games that are rated Mature. Critics charge that video game ratings are inconsistent with other schemes, such as the movie rating system.
- Video game addiction is associated more with online games, although many instances of single-player obsessions exist as well. It has become a high-profile issue with the rise in popularity of online gaming.
- The American Medical Association has not developed a diagnosis for video game addiction. Instead, it uses the term “video game overuse” to describe a state where an individual’s gaming habits have a detrimental effect on his or her personal life.
- Sexism and gender issues are hot topics in the video game industry, as female gamers and developers often struggle for equal footing. Controversy has arisen over character stereotypes; however, modern games are beginning to break through conventional barriers of gender.
Exercises
Choose video game violence or video game addiction and search for it on the Internet. Examine the research that is associated with the issue you chose. Then consider the following questions:
- What is your opinion of the issue in light of the research?
- Has your issue been researched thoroughly?
- Visit message boards and forums and search for journal articles related to video game addiction or violence. What actions would you suggest to research these issues more fully?
References
Adams, Jill U. “A Closer Look: Effects of Violent Video Games,” Los Angeles Times, May 3, 2010, http://www.latimes.com/news/health/la-he-closer-20100503,0,5586471.story.
Arcade History, “Berzerk, the Video Game,” http://www.arcade-history.com/?n=berzerk&page=detail&id=236.
Bartholow, Bruce D. and Craig A. Anderson, “Effects of Violent Video Games on Aggressive Behavior: Potential Sex Differences,” Journal of Experimental Social Psychology 38, no. 3 (2002): 283–290.
Cain, Geoffrey. “South Korea Cracks Down on Gaming Addiction,” Time, April 20, 2010, http://www.time.com/time/world/article/0,8599,1983234,00.html.
Cochran, Lee. “Teens Say: Video Game Made Them Do It,” ABC News, June 27, 2008, http://abcnews.go.com/TheLaw/story?id=5262689.
Doctorow, Cory. “Gamasutra’s Most Important Gamers List Is a Boy’s Club,” BoingBoing (blog), April 14, 2010, http://boingboing.net/2010/04/14/gamasutras-most-impo.html.
Entertainment Software Association, Essential Facts About the Computer and Video Game Industry: 2009.
Greene, R. W. “Is Internet Addiction for Worrywarts or a Genuine Problem?” CNN, September 23, 1998, http://www.cnn.com/TECH/computing/9809/23/netaddict.idg/index.html.
Harvey, Mike. “Teenager Daniel Petric Shot Parents Who Took Away Xbox,” Times (London), January 13, 2009, http://www.timesonline.co.uk/tol/news/world/us_and_americas/article5512446.ece.
Hyman, Paul. “Video Game Rating Board Don’t Get No Respect,” Hollywood Reporter, April 8, 2005, http://www.hollywoodreporter.com/hr/search/article_display.jsp?vnu_content_id=1000874859.
Khan, Mohamed. Emotional and Behavioral Effects of Video Games and Internet Overuse, American Medical Association, Council on Science and Public Health, 2007, http://www.ama-assn.org/ama1/pub/upload/mm/443/csaph12a07-fulltext.pdf.
Media Awareness Network, “Gender Stereotyping,” http://www.media-awareness.ca/english/parents/video_games/concerns/gender_videogames.cfm.
On-line Gamers Anonymous, “About OLGA & OLG-Anon,” http://www.olganon.org.
Rossignol, This Gaming Life, 4–11.
Strauss, Michael. “A Look at Female Characters in Video Games,” Associated Content, July 16, 2010, http://www.associatedcontent.com/article/5487226/a_look_at_female_characters_in_video_pg2.html?cat=19.
Ward, Mark. “Columbine Families Sue Computer Game Makers,” BBC News, May 1, 2001, http://news.bbc.co.uk/2/hi/science/nature/1295920.stm.
10.6 Blurring the Boundaries Between Video Games, Information, Entertainment, and Communication
Learning Objectives
- Identify how video games allow previously marginalized groups to enjoy new social experiences.
- Explain how virtual worlds can be used to facilitate learning.
- Describe the different ways video games have been used to communicate messages.
Early video games were easily identifiable as games. They had basic goals and set rules of play. As games have developed in complexity, they have allowed for newer options and an expanded idea of what constitutes a game. At the same time, the aesthetics and design of game culture have been applied to tools and entertainment phenomena that are not technically games. The result is a blurring of the lines between games and other forms of communication, and the creation of new means of mixing information, entertainment, and communication.
Video Games and the Social World of Sports
Video games have developed in complexity and flexibility to the point that they are providing new kinds of social interactions. This is perhaps clearest when games are compared to physical sports. Most of the abstract reasons for involvement in sports—things like teamwork, problem solving, and leadership—are also reasons to play video games that allow online team interaction. Sports are often portrayed as a social platform where people can learn new skills. Amateur sports teams provide an important means of socialization and communication for many people, allowing individuals from different gender, ethnic, and class backgrounds to find a common forum for communication. Video games have similar capacities; players who engage in online team battles must communicate with one another effectively and learn to collectively solve problems. In fact, video games could arguably provide a more inclusive platform for socializing because they do not exclude most socially or physically challenged individuals. The social platform of online games is certainly no utopia, as competitive endeavors of any sort inevitably create opportunities for negative communication, but it allows people of different genders, races, and nationalities to play on equal footing, a feat that is difficult, if not impossible, in the physical realm.
Virtual Worlds and Societal Interaction
Massively multiplayer online role-playing games (MMORPGs) such as World of Warcraft, EverQuest, and EVE Online allow for an unprecedented variety of goals to be pursued. According to one study on online communities,
that means games are no longer meant to be a solitary activity played by a single individual. Instead, the player is expected to join a virtual community that is parallel with the physical world, in which societal, cultural, and economical systems arise (Zaphiris et al., 2008).
MMORPGs can function as a kind of hybrid of social media and video games in this respect. In some instances, these games could function entirely as social media, and not as games. Consider a player who has no desire to achieve any of the prescribed goals of a game, such as going on quests or accomplishing tasks. This individual could simply walk around in social areas and talk to new people. Is the player still playing a game at this point, or has the game turned into a type of social media?
Virtual worlds such as Second Life are good examples of the thin line between games and social media. In this virtual world, users create avatars to represent themselves. They can then explore the world and take part in the culture that characterizes it. The Second Life website FAQ (frequently asked questions) tries to illustrate the differences between a virtual world and a video game:
While the Second Life interface and display are similar to most popular massively multiplayer online role playing games (or MMORPGs), there are two key, unique differences:
Creativity: The Second Life virtual world provides almost unlimited freedom to its Residents. This world really is whatever you make it. If you want to hang out with your friends in a garden or nightclub, you can. If you want to go shopping or fight dragons, you can. If you want to start a business, create a game or build a skyscraper you can. It’s up to you.
Ownership: Instead of paying a monthly subscription fee, Residents can start a Basic account for FREE. Additional Basic accounts cost a one-time flat fee of just $9.95. If you choose to get land to live, work and build on, you pay a monthly lease fee based on the amount of land you have. You also own anything you create—Residents retain intellectual property rights over their in-world creations (Second Life).
Virtual worlds may differ from traditional video games in key ways, but these differences will likely be further blurred as games develop. Regardless of the way these worlds are classified, they are based heavily on the aesthetics and design of video games.
Virtual worlds are societies in which communication can be expanded to relay information in ways that cannot be done with other forms of media. Universities have set up virtual classrooms in Second Life that employ techniques for teaching that are otherwise impossible. Curriculum can be set up to take advantage of the virtual world’s creative capacity (Kluge & Riley, 2008). For example, a class on architecture could lead students through a three-dimensional recreation of a Gothic cathedral, a history class could rely on an interactive battlefield simulation to explain the mechanics of a famous battle, or a physics class could use simulations to demonstrate important principles.
Virtual worlds have been used to relay new kinds of information as well. In 2007, specialists developed virtual-reality simulations to help patients with Asperger’s syndrome navigate through social situations. Individuals with Asperger’s syndrome have difficulty recognizing nonverbal cues and adapting to change. The simulations allowed these patients to create avatars that then played out social situations such as job interviews. The simulations could be adjusted to allow for more complicated interactions, and the users could replay their own performances to better understand the simulation (University of Texas at Dallas News Center, 2008). The creation of avatars and the ability to replay scenes until the user improves performance is a direct influence of video games.
Social Media and Games
Social media outlets such as Facebook have also used video games to expand their communication platforms. FarmVille, a game in which players plant, harvest, and sell crops and expand their farm plots with the profits, is integrated with Facebook so that all users can connect with their real-life friends in the game. The game adds another aspect of communication—one centered on competition, strategy, and the aesthetics of game-play—to the Facebook platform. In 2010, FarmVille had 80 million users, an unprecedented number of people playing a single game. Other games for Facebook include Lexulous (formerly Scrabulous), a game that mimicked the board game Scrabble closely enough that it had to be changed, and Pet Society, a game that allows users to raise virtual animals as pets. These games are unique in their demographic scope; they seek to engage literally anyone. By coupling games with social media sites, game designers have pushed the influence of video games to unprecedented levels (Walker, 2010).
Due to the increasing popularity of video games, many social networking websites have emerged in recent years that target hard-core gamers and casual gamers alike. The website GamingPassions.com describes itself as a “free online dating and social networking site for the Video Gaming community” that “allows you to meet other video game lovers who ‘get it’” (Gaming Passions). Others, such as DateCraft, target players who share a love of specific games, such as World of Warcraft. These websites allow players to socialize outside of the confines of a video game and provide a way for fans of any game to connect (Fawkes, 2010).
Mobile Phones and Gaming
The maker of FarmVille, a company called Zynga, has released a number of social networking games that can be played via Internet-enabled mobile phones. Mafia Wars, a simulated version of organized crime that originated as an online game, lets users manage their money and complete jobs from their mobile phones. This innovation is only a change of platform from the computer to the phone, but it promises to make gaming a common means of social interaction.
Phones are used to communicate directly with another person, whether through text messages, Facebook posts, or phone calls. Adding games to phones creates a new language that people can use to communicate with each other. One can imagine the ways in which a spat between friends or a couple hooking up could be communicated through this medium. During the week of Valentine’s Day in 2010, the creators of FarmVille reported that users sent more than 500 million copies of virtual gifts to each other (Ingram, 2010). Competition, aggression, hostility, as well as generosity and general friendliness are given new means of expression with the addition of video games to the mobile platform.
Video Games and Their Messages
An important genre of games is often overlooked when focusing on the overall history and effect of video games. These games are created to get a specific point across to the player, and are generally low-budget efforts by individuals or small groups. The group Molleindustria puts out a host of these games, including Oligarchy, a game in which the player is an oil-drilling business executive who pursues corruption to gain profits, and Faith Fighter, in which religious icons fight each other (Molleindustria). Downing Street Fighter is a similar game, designed as a satirical representation of the United Kingdom’s 2010 elections (Verjee, 2010).
Other games in this genre include a great deal of content that informs players about current events. One example is Cutthroat Capitalism, in which players take the part of Somali pirates carrying out kidnapping and piracy missions, weighing the risks and rewards of their exploits against a variety of economic factors (Carney, 2009). This game allows players to understand the forces at work behind international phenomena such as piracy. Other games include the Redistricting Game, a game aimed at encouraging congressional redistricting reform, and Planet Green Game, in which players find ways to conserve energy (McGeveran, 2007). These games’ publishers have used their medium to transmit information in new ways. Political satire in the medium of the video game allows the player to, in effect, play a satirical joke to its logical conclusion, exploring the entire scope of the joke. Advocacy games engage players with an interactive political message, rather than sending out traditional media ads. Mainstream groups such as PETA have also turned to video games to convey messages. During the 2008 Thanksgiving season, the animal-rights group released a parody of the popular Cooking Mama games to demonstrate the cruelty of eating meat (Sliwinski, 2008).
Although games of this type are not in the same league as series like Halo—either in design or in popularity—they are pushing the boundaries of the video game as a form of media. Political pamphlets were a major source of information and political discourse during the American Revolution. Video games may well come to serve a similar informational function.
Key Takeaways
- Team-oriented video games have allowed new groups to learn skills similar to those learned in physical sports. Team building, competition, and problem solving are all major components of playing on a virtual team.
- Virtual worlds, though not video games in themselves, use the design and principles of video games to create new social spaces. These spaces are being used to create new social platforms and new ways of communicating complex ideas.
- Video games have been added to social networking sites to expand the ways in which people can communicate with each other. The medium of the game allows players to interact with their friends in new ways.
Exercises
Several examples of the ways in which virtual worlds could be used for education were mentioned in this section.
- Think of ways in which this course could be taught in a virtual world. What advantages would this kind of teaching bring to a course like this?
- What disadvantages does it carry with it?
End-of-Chapter Assessment
Review Questions
Section 1
- Identify the companies that headed the video game industry in the 1970s, ’80s, ’90s, and 2000s.
- Name three important innovations in video games that appealed to new audiences.
- Name three technologies that were fundamental to the evolution of video games.
- Explain the importance of the Internet in the evolution of video games.
Section 2
- Name three of the games listed in this section and explain how they were innovative.
- Explain how World of Warcraft expanded the video game as a form of media.
- How are the latest trends in video games expanding the video game?
- Compare music releases on Guitar Hero or Rock Band with traditional music releases such as MP3s or physical albums. What are the advantages and disadvantages of these forms?
Section 3
- Describe some of the aesthetics and ideas of gaming culture.
- Pick a type of media, such as film or books, and analyze the ways that video games have affected it.
- Describe one response of mainstream culture to the influence of video games.
Section 4
- Summarize the points in the public debate over video game violence.
- When do you think video game use becomes an addiction?
- Name one way that the controversy over video game violence could be addressed.
- Explain the ways in which video game and movie ratings differ.
Section 5
- Give an example of the ways in which video games have allowed people to experience new things.
- Give an example of the ways in which virtual worlds can be used in education.
- Compare games that advocate political messages with other forms of political persuasion.
Critical Thinking Questions
- How has youth culture been influenced and even manipulated by the makers of video games?
- How have video games influenced your communication or learning?
- Should video games be rated in a different way than movies? Explain your answer.
- What are the limitations of video games when compared to other forms of media?
- Do you think the overall influence of video games on culture has been positive or negative? Explain your answer.
Career Connection
Video games are a growing industry, and the budgets to create them are increasing every year. Video games require large production teams. The following jobs are important aspects of video game production:
- Game developer
- Art director
- Programmer
- Sound designer
- Producer
- Game tester
- Animator
Choose one of the jobs listed here or find a different job associated with the games industry and research the requirements for it online. When you have researched your job, answer the following questions:
- What made you choose this job?
- What strengths do you have that would help you excel at this job?
- List the steps that you would take if you were to pursue this job.
References
Carney, Scott. “An Economic Analysis of the Somali Pirate Business Model,” Wired, July 13, 2009, http://www.wired.com/politics/security/magazine/17-07/ff_somali_pirates.
Fawkes, Guy. “Online Dating for Video Gamers,” Associated Content, March 19, 2010, http://www.associatedcontent.com/article/2806686/online_dating_for_video_game_players.html?cat=41.
Gaming Passions, “Welcome to Gaming Passions,” http://www.gamingpassions.com/.
Molleindustria, http://www.molleindustria.org/en/home.
Ingram, Mathew. “Farmville Users Send 500M Valentines in 48 Hours,” GigaOM, February 10, 2010, http://gigaom.com/2010/02/10/farmville-users-send-500m-valentines-in-48-hours/.
Kluge, Stacy, and Liz Riley, “Teaching in Virtual Worlds: Issues and Challenges,” Issues in Informing Science and Information Technology 5 (2008): 128.
McGeveran, William. “Video Games With a Message,” Info/Law (blog), June 18, 2007, http://blogs.law.harvard.edu/infolaw/2007/06/18/video-games-with-a-message/.
Second Life, “Frequently Asked Questions: Is Second Life a game?” http://secondlife.com/whatis/faq.php#02.
Sliwinski, Alexander. “PETA Parody Grills Cooking Mama,” Joystiq, November 17, 2008, http://www.joystiq.com/2008/11/17/peta-parody-grills-cooking-mama/.
University of Texas at Dallas News Center, “Avatars Help Asperger Syndrome Patients Learn to Play the Game of Life,” news release, University of Texas at Dallas News Center, November 18, 2007, http://www.utdallas.edu/news/2007/11/18-003.html.
Verjee, Zain. “Will There Be a Knockout Blow?” CNN, May 3, 2010, http://ukelection.blogs.cnn.com/2010/05/03/will-there-be-a-knock-out-blow/.
Walker, Tim. “Welcome to FarmVille: Population 80 million,” Independent (London), February 22, 2010, http://www.independent.co.uk/life-style/gadgets-and-tech/features/welcome-to-farmville-population-80-million-1906260.html.
Zaphiris, Panayiotis, Chee Siang Ang, and Andrew Laghos, “Online Communities,” in The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies, and Emerging Applications, ed. Andrew Sears and Julie Jacko (New York: Taylor & Francis Group, 2008), 607.
Chapter 11: The Internet and Social Media
11.1 The Internet and Social Media
11.2 The Evolution of the Internet
11.3 Social Media and Web 2.0
11.4 The Effects of the Internet and Globalization on Popular Culture and Interpersonal Communication
11.5 Issues and Trends
11.1 The Internet and Social Media
Cleaning Up Your Online Act
It used to be that applying for a job was fairly simple: send in a resume, write a cover letter, and call a few references to make sure they will say positive things. The hiring manager understands that this is a biased view, designed to make the applicant look good, but that is all forgivable. After all, everyone applying for a particular job is going through this same process, and barring great disasters, the chances of something particularly negative reaching the desk of a hiring manager are not that great.
However, there is a new step that is now an integral part of this application process—hiding (or at least cleaning up) the applicants’ virtual selves. This could entail “Googling”—shorthand for searching on Google—their own name to see the search results. If the first thing that comes up is a Flickr album (an online photo album from the photo-sharing site Flickr) from last month’s Olympian-themed cocktail party, it may be a good idea to make that album private to ensure that only friends can view the album.
The ubiquity of Web 2.0 social media like Facebook and Twitter allows anyone to easily start developing an online persona from as early as birth (depending on the openness of one’s parents)—and although this online persona may not accurately reflect the individual, it may be one of the first things a stranger sees. Those online photos may not look bad to friends and family, but one’s online persona may be a hiring manager’s first impression of a prospective employee. Someone in charge of hiring could search the Internet for information on potential new hires even before calling references.
First impressions are an important thing to keep in mind when making an online persona professionally acceptable. Your presence online can be the equivalent of your first words to a brand-new acquaintance. Instead of showing a complete stranger your pictures from a recent party, it might be a better idea to hide those pictures and replace them with a well-written blog—or a professional-looking website.
The content on social networking sites like Facebook, where people use the Internet to meet new people and maintain old friendships, is nearly indestructible and may not actually belong to the user. In 2008, as Facebook was quickly gaining momentum, The New York Times ran an article, “How Sticky Is Membership on Facebook? Just Try Breaking Free”—a title that seems at once like a warning and a big-brother taunt. The website does allow the option of deactivating one’s account, but “Facebook servers keep copies of the information in those accounts indefinitely (Aspan, 2008).” It is a double-edged sword: On one hand, users who become disillusioned and quit Facebook can come back at any time and resume their activity; on the other, one’s information is never fully deleted. If a job application might be compromised by the presence of a Facebook profile, clearing the slate is possible, albeit with some hard labor. The user must delete, item by item, every individual wall post, every group membership, every photo, and everything else.
Not all social networks are like this—MySpace and Friendster still require users who want to delete their accounts to confirm this several times, but they offer a clear-cut “delete” option—but the sticky nature of Facebook information is nothing new (Aspan, 2008). Google even keeps a cache of deleted web pages, and the Internet Archive keeps decades-old historical records. This transition from ephemeral media—TV and radio, practically over as quickly as they are broadcast—to the enduring permanence of the Internet may seem strange, but in some ways it is built into the very structure of the system. Understanding how the Internet was conceived may help elucidate the ways in which the Internet functions today—from the difficulties of deleting an online persona to the speedy and near-universal access to the world’s information.
References
Aspan, Maria. “How Sticky Is Membership on Facebook? Just Try Breaking Free,” New York Times, February 11, 2008, http://www.nytimes.com/2008/02/11/technology/11facebook.html.
11.2 The Evolution of the Internet
Learning Objectives
- Define protocol and decentralization as they relate to the early Internet.
- Identify technologies that made the Internet accessible.
- Explain the causes and effects of the dot-com boom and crash.
From its early days as a military-only network to its current status as one of the developed world’s primary sources of information and communication, the Internet has come a long way in a short period of time. Yet there are a few elements that have stayed constant and that provide a coherent thread for examining the origins of the now-pervasive medium. The first is the persistence of the Internet—its Cold War beginnings necessarily influencing its design as a decentralized, indestructible communication network.
The second element is the development of rules of communication for computers that enable the machines to turn raw data into useful information. These rules, or protocols, have been developed through consensus by computer scientists to facilitate and control online communication and have shaped the way the Internet works. Facebook is a simple example of a protocol: Users can easily communicate with one another, but only through acceptance of protocols that include wall posts, comments, and messages. Facebook’s protocols make communication possible and control that communication.
These two elements connect the Internet’s origins to its present-day incarnation. Keeping them in mind as you read will help you comprehend the history of the Internet, from the Cold War to the Facebook era.
The History of the Internet
The near indestructibility of information on the Internet derives from a military principle used in secure voice transmission: decentralization. In the early 1960s, the RAND Corporation developed a technology (later called “packet switching”) that allowed users to send secure voice messages. In contrast to a system known as the hub-and-spoke model, where the telephone operator (the “hub”) would patch two people (the “spokes”) through directly, this new system allowed for a voice message to be sent through an entire network, or web, of carrier lines, without the need to travel through a central hub, allowing for many different possible paths to the destination.
During the Cold War, the U.S. military was concerned about a nuclear attack destroying the hub in its hub-and-spoke model; with this new web-like model, a secure voice transmission would be more likely to endure a large-scale attack. A web of data pathways would still be able to transmit secure voice “packets,” even if a few of the nodes—places where the web of connections intersected—were destroyed. Only through the destruction of all the nodes in the web could the data traveling along it be completely wiped out—an unlikely event in the case of a highly decentralized network.
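The resilience of such a web can be demonstrated in a few lines of code. The sketch below is a toy model, not a real network: five invented nodes are linked in a web, and a breadth-first search checks whether a message can still find a path between two survivors after some nodes are destroyed.

```python
# A tiny model of the web-like network: even after some nodes are knocked
# out, a path between the survivors often remains. Breadth-first search
# checks whether a message can still get through. Node names are invented.
from collections import deque

links = {
    "A": {"B", "C"}, "B": {"A", "D"}, "C": {"A", "D", "E"},
    "D": {"B", "C", "E"}, "E": {"C", "D"},
}

def reachable(start, goal, links, destroyed=frozenset()):
    """Return True if a path from start to goal survives the destroyed nodes."""
    if start in destroyed or goal in destroyed:
        return False
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:
            return True
        for nxt in links.get(node, ()):
            if nxt not in seen and nxt not in destroyed:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(reachable("A", "E", links))                   # -> True
print(reachable("A", "E", links, destroyed={"C"}))  # -> True, rerouted via B and D
```

Only when every path is severed—here, destroying both B and C—does the message fail to arrive, which is the decentralization argument in miniature.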
This decentralized network could only function through common communication protocols. Just as we use certain protocols when communicating over a telephone—“hello,” “goodbye,” and “hold on for a minute” are three examples—any sort of machine-to-machine communication must also use protocols. These protocols constitute a shared language enabling computers to understand each other clearly and easily.
The Building Blocks of the Internet
In 1973, the U.S. Defense Advanced Research Projects Agency (DARPA) began research on protocols to allow computers to communicate over a distributed network. This work paralleled that done by the RAND Corporation, particularly in the realm of a web-based network model of communication. Instead of using electronic signals to send an unending stream of ones and zeros over a line (the equivalent of a direct voice connection), DARPA used this new packet-switching technology to send small bundles of data. This way, a message that would have been an unbroken stream of binary data—extremely vulnerable to errors and corruption—could be packaged as only a few hundred numbers.
Imagine a telephone conversation in which any static in the signal would make the message incomprehensible. Whereas humans can infer meaning from “Meet me [static] the restaurant at 8:30” (we replace the static with the word at), computers do not necessarily have that logical linguistic capability. To a computer, this constant stream of data is incomplete—or “corrupted,” in technological terminology—and confusing. Considering the susceptibility of electronic communication to noise or other forms of disruption, it would seem like computer-to-computer transmission would be nearly impossible.
However, the packets in this packet-switching technology have something that allows the receiving computer to make sure the packet has arrived uncorrupted. Because of this new technology and the shared protocols that made computer-to-computer transmission possible, a single large message could be broken into many pieces and sent through an entire web of connections, speeding up transmission and making that transmission more secure.
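The essentials of this idea—numbered packets, each carrying a check value so the receiver can detect corruption and reassemble the pieces in order—can be sketched in a few lines. This is a simplified illustration, not an actual Internet protocol; all of the function names and the packet layout here are invented.

```python
# A toy illustration of packet switching: split a message into small,
# numbered packets, each with a checksum, then verify and reassemble them.
# (Real protocols such as TCP are far more elaborate.)
import zlib

def to_packets(message: str, size: int = 8):
    """Break a message into numbered packets with CRC-32 checksums."""
    data = message.encode("utf-8")
    packets = []
    for seq, start in enumerate(range(0, len(data), size)):
        chunk = data[start:start + size]
        packets.append({"seq": seq, "payload": chunk, "checksum": zlib.crc32(chunk)})
    return packets

def reassemble(packets):
    """Verify each packet's checksum, then rejoin the payloads in order."""
    for p in packets:
        if zlib.crc32(p["payload"]) != p["checksum"]:
            raise ValueError(f"packet {p['seq']} arrived corrupted")
    ordered = sorted(packets, key=lambda p: p["seq"])
    return b"".join(p["payload"] for p in ordered).decode("utf-8")

packets = to_packets("Meet me at the restaurant at 8:30")
packets.reverse()  # packets may arrive in any order
print(reassemble(packets))  # -> Meet me at the restaurant at 8:30
```

Because each packet is independently numbered and checked, the pieces can travel different routes through the web of connections and still be put back together correctly at the other end.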
One of the necessary parts of a network is a host. A host is a physical node that is directly connected to the Internet and “directs traffic” by routing packets of data to and from other computers connected to it. In a normal network, a specific computer is usually not directly connected to the Internet; it is connected through a host. A host in this case is identified by an Internet protocol, or IP, address (a concept that is explained in greater detail later). Each unique IP address refers to a single location on the global Internet, but that IP address can serve as a gateway for many different computers. For example, a college campus may have one global IP address for all of its students’ computers, and each student’s computer might then have its own local IP address on the school’s network. This nested structure allows billions of different global hosts, each with any number of computers connected within their internal networks. Think of a campus postal system: All students share the same global address (1000 College Drive, Anywhere, VT 08759, for example), but they each have an internal mailbox within that system.
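The campus-mailroom analogy above can be modeled directly: one globally visible address fronts many machines, each with its own local address. The sketch below is a deliberately simplified model of that nesting; all of the addresses and machine names are invented for illustration.

```python
# A simplified model of nested addressing: one global address (the host)
# fronts many machines with local addresses, like a campus mailroom.
campus_network = {
    "global_ip": "203.0.113.10",  # the one address visible to the wider Internet
    "local_hosts": {
        "192.168.1.2": "alice-laptop",
        "192.168.1.3": "bob-desktop",
    },
}

def deliver(global_ip, local_ip, networks):
    """Route first to the global host, then to the machine inside it."""
    for net in networks:
        if net["global_ip"] == global_ip:
            return net["local_hosts"].get(local_ip, "unknown machine")
    return "unknown host"

print(deliver("203.0.113.10", "192.168.1.2", [campus_network]))  # -> alice-laptop
```

The two-step lookup is the point: the wider Internet only ever needs to know the single global address, while the internal network sorts mail into the right local mailbox.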
The early Internet was called ARPANET, after the U.S. Advanced Research Projects Agency (which added “Defense” to its name and became DARPA in 1973), and consisted of just four hosts: UCLA, Stanford, UC Santa Barbara, and the University of Utah. Now there are hundreds of millions of hosts, and each of those hosts likely serves thousands of people (Central Intelligence Agency). Each host uses protocols to connect to an ever-growing network of computers. Because of this, the Internet does not exist in any one place in particular; rather, it is the name we give to the huge network of interconnected computers that collectively form the entity that we think of as the Internet. The Internet is not a physical structure; it is the protocols that make this communication possible.
One of the other core components of the Internet is the Transmission Control Protocol (TCP) gateway. Proposed in a 1974 paper, the TCP gateway acts “like a postal service (Cerf, et. al., 1974).” Without knowing a specific physical address, any computer on the network can ask for the owner of any IP address, and the TCP gateway will consult its directory of IP address listings to determine exactly which computer the requester is trying to contact. The development of this technology was an essential building block in the interlinking of networks, as computers could now communicate with each other without knowing the specific address of a recipient; the TCP gateway would figure it all out. In addition, the TCP gateway checks for errors and ensures that data reaches its destination uncorrupted. Today, this combination of TCP gateways and IP addresses is called TCP/IP and is essentially a worldwide phone book for every host on the Internet.
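The “postal service” directory described above can be pictured as a simple lookup table: a sender supplies a name, and the directory returns the address, so the sender never needs to know it in advance. The real TCP/IP machinery is far more involved than this; the sketch below only illustrates the directory analogy, and its entries are invented.

```python
# A toy version of the "postal service" directory: a lookup table mapping
# a name to a host's address. (Real TCP/IP involves much more than this;
# the entries here are invented for illustration.)
directory = {
    "ucla": "10.0.0.1",
    "stanford": "10.0.0.2",
    "utah": "10.0.0.3",
}

def resolve(name: str) -> str:
    """Look up a host's address by name, like consulting a phone book."""
    address = directory.get(name.lower())
    if address is None:
        raise KeyError(f"no listing for {name!r}")
    return address

print(resolve("Stanford"))  # -> 10.0.0.2
```

A failed lookup raises an error rather than guessing, much as a real gateway cannot deliver to a host it has no listing for.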
You’ve Got Mail: The Beginnings of the Electronic Mailbox
E-mail has, in one sense or another, been around for quite a while. Originally, electronic messages were recorded within a single mainframe computer system. Each person working on the computer would have a personal folder, so sending that person a message required nothing more than creating a new document in that person’s folder. It was just like leaving a note on someone’s desk (Peter, 2004), so that the person would see it when he or she logged onto the computer.
However, once networks began to develop, things became slightly more complicated. Computer programmer Ray Tomlinson is credited with inventing the naming system we have today, using the @ symbol to denote the server (or host, from the previous section). In other words, name@gmail.com tells the host “gmail.com” (Google’s e-mail server) to drop the message into the folder belonging to “name.” Tomlinson is credited with writing the first network e-mail using his program SNDMSG in 1971. This simple addressing standard is often cited as one of the most important factors in the rapid spread of the Internet, and e-mail remains one of the most widely used Internet services.
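The @ convention is simple enough to express in a couple of lines: everything after the @ names the host, and everything before it names the mailbox on that host. The minimal parser below handles only well-formed addresses; real e-mail address validation is considerably stricter.

```python
# The @ convention in miniature: split an address into the mailbox name
# and the host that holds it. (Real address validation is much stricter.)
def parse_address(address: str):
    mailbox, sep, host = address.partition("@")
    if not sep or not mailbox or not host:
        raise ValueError(f"not a valid address: {address!r}")
    return mailbox, host

print(parse_address("name@gmail.com"))  # -> ('name', 'gmail.com')
```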
The use of e-mail grew in large part because of later commercial developments, especially America Online, that made connecting to e-mail much easier than it had been at its inception. Internet service providers (ISPs) packaged e-mail accounts with Internet access, and almost all web browsers (such as Netscape, discussed later in the section) included a form of e-mail service. In addition to the ISPs, e-mail services like Hotmail and Yahoo! Mail provided free e-mail addresses paid for by small text ads at the bottom of every e-mail message sent. These free “webmail” services soon expanded to comprise a large part of the e-mail services that are available today. Far from the original maximum inbox sizes of a few megabytes, today’s e-mail services, like Google’s Gmail service, generally provide gigabytes of free storage space.
E-mail has revolutionized written communication. The speed and relatively inexpensive nature of e-mail makes it a prime competitor of postal services—including FedEx and UPS—that pride themselves on speed. Communicating via e-mail with someone on the other end of the world is just as quick and inexpensive as communicating with a next-door neighbor. However, the growth of Internet shopping and online companies such as Amazon.com has in many ways made the postal service and shipping companies more prominent—not necessarily for communication, but for delivery and remote business operations.
Hypertext: Web 1.0
In 1989, Tim Berners-Lee, a graduate of Oxford University and software engineer at CERN (the European particle physics laboratory), had the idea of using a new kind of protocol to share documents and information throughout the local CERN network. Instead of transferring regular text-based documents, he created a new language called hypertext markup language (HTML). Hypertext was a new word for text that goes beyond the boundaries of a single document. Hypertext can include links to other documents (hyperlinks), text-style formatting, images, and a wide variety of other components. The basic idea is that documents can be constructed out of a variety of links and can be viewed just as if they are on the user’s computer.
This new language required a new communication protocol so that computers could interpret it, and Berners-Lee decided on the name hypertext transfer protocol (HTTP). Through HTTP, hypertext documents can be sent from computer to computer and can then be interpreted by a browser, which turns the HTML files into readable web pages. The browser that Berners-Lee created, called World Wide Web, was a combination browser-editor, allowing users to view other HTML documents and create their own (Berners-Lee, 2009).
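At bottom, hypertext is plain text marked up with tags that a browser interprets, including links out to other documents. The sketch below uses Python’s standard-library HTML parser to pull the hyperlinks out of a small hand-written page; the page and its URLs are invented for illustration.

```python
# Hypertext in miniature: an HTML document is plain text whose tags a
# browser interprets, including links to other documents. Here the
# standard-library parser extracts the hyperlinks from an invented page.
from html.parser import HTMLParser

page = """
<html><body>
  <h1>My Page</h1>
  <p>See <a href="http://example.org/history.html">the history</a>
  and <a href="http://example.org/photos.html">some photos</a>.</p>
</body></html>
"""

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":  # an <a href="..."> tag is a hyperlink
            self.links.extend(value for name, value in attrs if name == "href")

collector = LinkCollector()
collector.feed(page)
print(collector.links)
# -> ['http://example.org/history.html', 'http://example.org/photos.html']
```

Following those extracted links and fetching the documents they point to is, in essence, what every browser since Berners-Lee’s World Wide Web has done.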
Modern browsers, like Microsoft Internet Explorer and Mozilla Firefox, only allow for the viewing of web pages; other increasingly complicated tools are now marketed for creating web pages, although even the most complicated page can be written entirely from a program like Windows Notepad. The reason web pages can be created with the simplest tools is the adoption of certain protocols by the most common browsers. Because Internet Explorer, Firefox, Apple Safari, Google Chrome, and other browsers all interpret the same code in more or less the same way, creating web pages is as simple as learning how to speak the language of these browsers.
In 1991, the same year that Berners-Lee created his web browser, the Internet connection service Q-Link was renamed America Online, or AOL for short. This service would eventually grow to employ over 20,000 people, on the basis of making Internet access available (and, critically, simple) for anyone with a telephone line. Although the web in 1991 was not what it is today, AOL’s software allowed its users to create communities based on just about any subject, and it only required a dial-up modem—a device that connects any computer to the Internet via a telephone line—and the telephone line itself.
In addition, AOL incorporated two technologies—chat rooms and Instant Messenger—into a single program (along with a web browser). Chat rooms allowed many users to type live messages to a “room” full of people, while Instant Messenger allowed two users to communicate privately via text-based messages. The most important aspect of AOL was its encapsulation of all these once-disparate programs into a single user-friendly bundle. Although AOL was later disparaged for customer service issues like its users’ inability to deactivate their service, its role in bringing the Internet to mainstream users was instrumental (Zeller Jr., 2005).
In contrast to AOL’s proprietary services, the World Wide Web had to be viewed through a standalone web browser. The first of these browsers to make its mark was the program Mosaic, released by the National Center for Supercomputing Applications at the University of Illinois. Mosaic was offered for free and grew very quickly in popularity due to features that now seem integral to the web. Things like bookmarks, which allow users to save the location of particular pages without having to remember them, and images, now an integral part of the web, were all inventions that made the web more usable for many people (National Center for Supercomputing Applications).
Although the web browser Mosaic has not been updated since 1997, developers who worked on it went on to create Netscape Navigator, an extremely popular browser during the 1990s. AOL later bought the Netscape company, and the Navigator browser was discontinued in 2008, largely because Netscape Navigator had lost the market to Microsoft’s Internet Explorer web browser, which came preloaded on Microsoft’s ubiquitous Windows operating system. However, Netscape had long been converting its Navigator software into an open-source program called Mozilla Firefox, which is now the second-most-used web browser on the Internet (detailed in Table 11.1 “Browser Market Share (as of February 2010)”) (NetMarketshare). Firefox represents about a quarter of the market—not bad, considering its lack of advertising and Microsoft’s natural advantage of packaging Internet Explorer with the majority of personal computers.
For Sale: The Web
As web browsers became more available as a less-moderated alternative to AOL’s proprietary service, the web became something like a free-for-all of startup companies. The web of this period, often referred to as Web 1.0, featured many specialty sites that used the Internet’s ability for global, instantaneous communication to create a new type of business. Another name for this free-for-all of the 1990s is the “dot-com boom.” During the boom, it seemed as if almost anyone could build a website and sell it for millions of dollars. However, the “dot-com crash” that occurred later that decade seemed to say otherwise. Quite a few of these Internet startup companies went bankrupt, taking their shareholders down with them. Alan Greenspan, then the chairman of the U.S. Federal Reserve, called this phenomenon “irrational exuberance (Greenspan, 1996),” in large part because investors did not necessarily know how to analyze these particular business plans, and companies that had never turned a profit could be sold for millions. The new business models of the Internet may have done well in the stock market, but they were not necessarily sustainable. In many ways, investors collectively failed to analyze the business prospects of these companies, and once they realized their mistakes (and the companies went bankrupt), much of the recent market growth evaporated. The invention of new technologies can bring with it the belief that old business tenets no longer apply, but this dangerous belief—the “irrational exuberance” Greenspan spoke of—is not necessarily conducive to long-term growth.
Some lucky dot-com businesses formed during the boom survived the crash and are still around today. For example, eBay, with its online auctions, turned what seemed like a dangerous practice (sending money to a stranger you met over the Internet) into a daily occurrence. A less-fortunate company, eToys.com, got off to a promising start—its stock quadrupled on the day it went public in 1999—but then filed for Chapter 11 bankruptcy in 2001 (Barnes, 2001).
One of these startups, theGlobe.com, provided one of the earliest social networking services that exploded in popularity. When theGlobe.com went public, its stock shot from a target price of $9 to a close of $63.50 a share (Kawamoto, 1998). The site itself was started in 1995, building its business on advertising. As skepticism about the dot-com boom grew and advertisers became increasingly skittish about the value of online ads, theGlobe.com ceased to be profitable and shut its doors as a social networking site (The Globe, 2009). Although advertising is pervasive on the Internet today, the current model—largely based on the highly targeted Google AdSense service—did not come around until much later. In the earlier dot-com years, the same ad might be shown on thousands of different web pages, whereas now advertising is often specifically targeted to the content of an individual page.
However, that did not spell the end of social networking on the Internet. Social networking had been going on since at least the invention of Usenet in 1979 (detailed later in the chapter), but the recurring problem was always the same: profitability. This model of free access to user-generated content departed from almost anything previously seen in media, and revenue streams would have to be just as radical.
The Early Days of Social Media
The shared, generalized protocols of the Internet have allowed it to be easily adapted and extended into many different facets of our lives. The Internet shapes everything, from our day-to-day routine—the ability to read newspapers from around the world, for example—to the way research and collaboration are conducted. There are three important aspects of communication that the Internet has changed, and these have instigated profound changes in the way we connect with one another socially: the speed of information, the volume of information, and the “democratization” of publishing, or the ability of anyone to publish ideas on the web.
One of the Internet’s largest and most revolutionary changes has come about through social networking. Because of Twitter, we can now see what all our friends are doing in real time; because of blogs, we can consider the opinions of complete strangers who may never write in traditional print; and because of Facebook, we can find people we haven’t talked to for decades, all without making a single awkward telephone call.
Recent years have seen an explosion of new content and services; although the phrase “social media” now seems to be synonymous with websites like Facebook and Twitter, it is worthwhile to consider all the ways a social media platform affects the Internet experience.
How Did We Get Here? The Late 1970s, Early 1980s, and Usenet
Almost as soon as TCP stitched the various networks together, a former DARPA scientist named Larry Roberts founded Telenet, the first commercial packet-switching company. Two years later, in 1977, the invention of the dial-up modem (in combination with the wider availability of personal computers like the Apple II) made it possible for anyone around the world to access the Internet. With availability extended beyond purely academic and military circles, the Internet quickly became a staple for computer hobbyists.
One of the consequences of the spread of the Internet to hobbyists was the founding of Usenet. In 1979, Duke University graduate students Tom Truscott and Jim Ellis connected three computers in a small network and used a series of programming scripts to post and receive messages. In a very short span of time, this system spread all over the burgeoning Internet. Much like an electronic version of community bulletin boards, anyone with a computer could post a topic or reply on Usenet.
The group was fundamentally and explicitly anarchic, as outlined by the posting “What is Usenet?” This document says, “Usenet is not a democracy…there is no person or group in charge of Usenet…Usenet cannot be a democracy, autocracy, or any other kind of ‘-acy’” (Moraes, et al., 1999). Usenet was not used only for socializing, however, but also for collaboration. In some ways, the service allowed a new kind of collaboration that seemed like the start of a revolution: “I was able to join rec.kites and collectively people in Australia and New Zealand helped me solve a problem and get a circular two-line kite to fly,” one user told the United Kingdom’s Guardian (Jeffery, et al., 2009).
GeoCities: Yahoo! Pioneers
Fast-forward to 1995: The president and founder of Beverly Hills Internet, David Bohnett, announces that the name of his company is now “GeoCities.” GeoCities built its business by allowing users (“homesteaders”) to create web pages in “communities” for free, with the stipulation that the company placed a small advertising banner at the top of each page. Anyone could register a GeoCities site and subsequently build a web page about a topic. Almost all of the community names, like Broadway (live theater) and Athens (philosophy and education), were centered on specific topics (Archive, 1996).
This idea of centering communities on specific topics may have come from Usenet. In Usenet, the newsgroup name alt.rec.kites refers to a specific topic (kites) within a category (recreation) within a larger community (alternative topics). This hierarchical model allowed users to organize themselves across the vastness of the Internet, even on a large site like GeoCities. The difference with GeoCities was that it allowed users to do much more than post only text (the limitation of Usenet), while constraining them to a relatively small pool of resources. Although each GeoCities user had only a few megabytes of web space, standardized pictures—like mailbox icons and back buttons—were hosted on GeoCities’s main server. GeoCities was such a large part of the Internet, and these standard icons were so ubiquitous, that they have now become a veritable part of the Internet’s cultural history. The Web Elements category of the site Internet Archaeology is a good example of how pervasive GeoCities graphics became (Internet Archaeology, 2010).
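The hierarchical naming scheme described above is simple enough to sketch in a few lines. Each dot-separated component of a newsgroup name narrows the topic, so a single name encodes the whole path from broad community down to specific interest:

```python
# Sketch of Usenet's hierarchical naming: "alt.rec.kites" places the
# topic "kites" inside "recreation" inside the "alternative" community.

def hierarchy_levels(newsgroup):
    """Return the progressively narrower groups a name belongs to."""
    parts = newsgroup.split(".")
    return [".".join(parts[:i + 1]) for i in range(len(parts))]

levels = hierarchy_levels("alt.rec.kites")
# levels is ["alt", "alt.rec", "alt.rec.kites"]
```

A reader browsing at any level of this hierarchy could discover every narrower group beneath it, which is what made the scheme workable at the scale of the whole Internet.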
GeoCities built its business on a freemium model, where basic services are free but subscribers pay extra for things like commercial pages or shopping carts. Other Internet businesses, like Skype and Flickr, use the same model to keep a vast user base while still profiting from frequent users. Since loss of online advertising revenue was seen as one of the main causes of the dot-com crash, many current web startups are turning toward this freemium model to diversify their income streams (Miller, 2009).
GeoCities’s model was so successful that Yahoo! bought the company for $3.6 billion at its peak in 1999. At the time, GeoCities was the third-most-visited site on the web (behind Yahoo! and AOL), so it seemed like a sure bet. A decade later, on October 26, 2009, Yahoo! closed GeoCities for good in every country except Japan.
Diversification of revenue has become one of the most crucial elements of Internet businesses; from The Wall Street Journal online to YouTube, almost every website is now looking for multiple income streams to support its services.
Key Takeaways
- The two primary characteristics of the original Internet were decentralization and free, open protocols that anyone could use. As a result of its decentralized “web” model of organization, the Internet can store data in many different places at once. This makes it very useful for backing up data and very difficult to destroy data that might be unwanted. Protocols play an important role in this, because they allow some degree of control to exist without a central command structure.
- Two of the most important technological developments were the personal computer (such as the Apple II) and the dial-up modem, which allowed anyone with a phone line to access the developing Internet. America Online also played an important role, making it very easy for practically anyone with a computer to use the Internet. Another development, the web browser, allowed for access to and creation of web pages all over the Internet.
- With the advent of the web browser, it seemed as if anyone could make a website that people wanted to use. The problem was that these sites were driven largely by venture capital and grossly inflated initial public offerings of their stock. After failing to secure any real revenue stream, their stock plummeted, the market crashed, and many of these companies went out of business. In later years, companies tried to diversify their investments, particularly by using a “freemium” model of revenue, in which a company would both sell premium services and advertise, while offering a free pared-down service to casual users.
Exercises
Websites have many different ways of paying for themselves, and this can say a lot about both the site and its audience. The business models of today’s websites may also directly reflect the lessons learned during the early days of the Internet. Start this exercise by reviewing a list of common ways that websites pay for themselves, how they arrived at these methods, and what it might say about them:
- Advertising: The site probably has many casual viewers and may not necessarily be well established. If there are targeted ads (such as ads directed toward stay-at-home parents with children), then it is possible the site is successful with a small audience.
- Subscription option: The site may be a news site that prides itself on accuracy of information or lack of bias, whose regular readers are willing to pay a premium for the guarantee of quality material. Alternately, the site may cater to a small demographic of Internet users by providing them with exclusive, subscription-only content.
- Selling services: Online services, such as file hosting, or offline services and products are probably the clearest way to determine a site’s revenue stream. However, these commercial sites often are not prized for their unbiased information, and their bias can greatly affect the content on the site.
Choose a website that you visit often, and list which of these revenue streams the site might have. How might this affect the content on the site? Is there a visible effect, or does the site try to hide it? Consider how events during the early history of the Internet may have affected the way the site operates now. Write down a revenue stream that the site does not currently have and how the site designers might implement such a revenue stream.
References
Archive, While GeoCities is no longer in business, the Internet Archive maintains the site at http://www.archive.org/web/geocities.php. Information taken from December 21, 1996.
Barnes, Cecily. “eToys files for Chapter 11,” CNET, March 7, 2001, http://news.cnet.com/2100-1017-253706.html.
Berners-Lee, Tim. “The WorldWideWeb Browser,” 2009, https://www.w3.org/People/Berners-Lee/WorldWideWeb.
Central Intelligence Agency, “Country Comparison: Internet Hosts,” World Factbook, https://www.cia.gov/library/publications/the-world-factbook/rankorder/2184rank.html.
Cerf, Vinton, Yogen Dalal, and Carl Sunshine, “Specification of Internet Transmission Control Program,” December 1974, http://tools.ietf.org/html/rfc675.
Greenspan, Alan. “The Challenge of Central Banking in a Democratic Society” (lecture, American Enterprise Institute for Public Policy Research, Washington, DC, December 5, 1996), http://www.federalreserve.gov/boarddocs/speeches/1996/19961205.htm.
Internet Archaeology, 2010, http://www.internetarchaeology.org/swebelements.htm.
Jeffery, Simon and others, “A People’s History of the Internet: From Arpanet in 1969 to Today,” Guardian (London), October 23, 2009, http://www.guardian.co.uk/technology/interactive/2009/oct/23/internet-arpanet.
Kawamoto, Dawn. “TheGlobe.com’s IPO one for the books,” CNET, November 13, 1998, http://news.cnet.com/2100-1023-217913.html.
Miller, Claire Cain. “Ad Revenue on the Web? No Sure Bet,” New York Times, May 24, 2009, http://www.nytimes.com/2009/05/25/technology/start-ups/25startup.html.
Moraes, Mark, Chip Salzenberg, and Gene Spafford, “What is Usenet?” December 28, 1999, http://www.faqs.org/faqs/usenet/what-is/part1/.
National Center for Supercomputing Applications, “About NCSA Mosaic,” 2010, http://www.ncsa.illinois.edu/Projects/mosaic.html.
NetMarketShare, “Browser Market Share,” http://marketshare.hitslink.com/browser-market-share.aspx?qprid=0&qpcal=1&qptimeframe=M&qpsp=132.
Peter, Ian. “The History of Email,” The Internet History Project, 2004, http://www.nethistory.info/History%20of%20the%20Internet/email.html.
The Globe, theglobe.com, “About Us,” 2009, http://www.theglobe.com/.
Zeller, Jr., Tom. “Canceling AOL? Just Offer Your Firstborn,” New York Times, August 29, 2005, http://www.nytimes.com/2005/08/29/technology/29link.html.
11.3 Social Media and Web 2.0
Learning Objectives
- Identify the major social networking sites, and give possible uses and demographics for each one.
- Show the positive and negative effects of blogs on the distribution and creation of information.
- Explain the ways privacy has been addressed on the Internet.
- Identify new information that marketers can use because of social networking.
Although GeoCities lost market share, and although theGlobe.com never really made it to the 21st century, social networking has persisted. There are many different types of social media available today, from social networking sites like Facebook to blogging services like Blogger and WordPress.com. All these sites bring something different to the table, and a few of them even try to bring just about everything to the table at once.
Social Networking
Social networking services—like Facebook, Twitter, LinkedIn, Google Buzz, and MySpace—provide a limited but public platform for users to create a “profile.” This can range anywhere from the 140-character (that’s letters and spaces, not words) “tweets” on Twitter, to the highly customizable MySpace, which allows users to blog, customize color schemes, add background images, and play music. Each of these services has its key demographic—MySpace, for example, is particularly geared toward younger users. Its huge array of features made it attractive to this demographic at first, but eventually it was overrun with corporate marketing and solicitations for pornographic websites, leading many users to abandon the service. In addition, competing social networking sites like Facebook offer superior interfaces that have lured away many of MySpace’s users. MySpace has attempted to catch up by upgrading its own interface, but it now faces the almost insurmountable obstacle of already-satisfied users of competing social networking services. As Internet technology evolves rapidly, most users have few qualms about moving to whichever site offers the better experience; most users have profiles and accounts on many services at once. But as relational networks become more and more established and concentrated on a few social media sites, it becomes increasingly difficult for newcomers and lagging challengers to offer the same rich networking experience. For a Facebook user with hundreds of friends in his or her social network, switching to MySpace and bringing along his or her entire network of friends is a daunting and infeasible prospect. Google has attempted to sidestep the problem of luring users to yet another new social network by building its Buzz service into its popular Gmail, ensuring that Buzz has a built-in user base and lowering the social cost of joining by leveraging users’ Gmail contact lists.
It remains to be seen if Google will be truly successful in establishing a vital new social networking service, but its tactic of integrating Buzz into Gmail underscores how difficult it has become to compete with established social networks like Twitter and Facebook.
Whereas MySpace initially catered to a younger demographic, LinkedIn caters to business professionals looking for networking opportunities. LinkedIn is free to join and allows users to post resumes and job qualifications (rather than astrological signs and favorite TV shows). Its tagline, “Relationships matter,” emphasizes the role of an increasingly networked world in business; just as a musician might use MySpace to promote a new band, a LinkedIn user can use the site to promote professional services. While these two sites have basically the same structure, they fulfill different purposes for different social groups; the character of social networking is highly dependent on the type of social circle.
Twitter offers a different approach to social networking, allowing users to “tweet” 140-character messages to their “followers,” making it something of a hybrid of instant messaging and blogging. Twitter is openly searchable, meaning that anyone can visit the site and quickly find out what other Twitter users are saying about any subject. Twitter has proved useful for journalists reporting on breaking news, as well as highlighting the “best of” the Internet. Twitter has also been useful for marketers looking for a free public forum to disseminate marketing messages. It became profitable in December 2009 through a $25 million deal allowing Google and Microsoft to display its users’ 140-character messages in their search results (Van Buskirk, 2009). Facebook, originally deployed exclusively to Ivy League schools, has since opened its doors to anyone over 13 with an e-mail account. With the explosion of the service and its huge growth among older demographics, “My parents joined Facebook” has become a common complaint (My Parents Joined Facebook).
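Twitter's 140-character constraint mentioned above is a limit on characters (letters, spaces, punctuation), not words. A minimal sketch of how a client might enforce it (hypothetical function names; not Twitter's actual API):

```python
# Sketch of Twitter's original 140-character limit. The limit counts
# characters, not words, so spaces and punctuation all count against it.

TWEET_LIMIT = 140

def fits_in_tweet(message):
    """True if the message is within the 140-character limit."""
    return len(message) <= TWEET_LIMIT

def truncate_for_tweet(message, ellipsis="..."):
    """Shorten an over-length message so it fits (one naive approach)."""
    if fits_in_tweet(message):
        return message
    return message[:TWEET_LIMIT - len(ellipsis)] + ellipsis

msg = "x" * 150          # 150 characters: too long to tweet
short = truncate_for_tweet(msg)  # exactly 140 characters, ending in "..."
```

The hard cap is part of what makes the service openly searchable and skimmable: every post is guaranteed to be a short, self-contained message.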
Another category of social media, blogs, began as an online, public version of a diary or journal. Short for “web logs,” these personal sites give anyone a platform to write about anything they want to. Posting tweets on the Twitter service is considered micro-blogging (because of the extremely short length of the posts). Some services, like LiveJournal, highlight their ability to provide up-to-date reports on personal feelings, even going so far as to add a “mood” shorthand at the end of every post. The Blogger service (now owned by Google) allows users with Google accounts to follow friends’ blogs and post comments. WordPress.com, the company that created the open-source blogging platform WordPress.org, and LiveJournal both follow the freemium model by allowing a basic selection of settings for free, with the option to pay for things like custom styles and photo hosting space. What these all have in common, however, is their bundling of social networking (such as the ability to easily link to and comment on friends’ blogs) with an expanded platform for self-expression. At this point, most traditional media companies have incorporated blogs, Twitter, and other social media as a way to allow their reporters to update instantly and often. This form of media convergence, discussed in detail in Section 11.4 “The Effects of the Internet and Globalization on Popular Culture and Interpersonal Communication” of this chapter, is now a necessary part of doing business.
There are many other types of social media out there, many of which can be called to mind with a single name: YouTube (video sharing), Wikipedia (open-source encyclopedia composed of “wikis” editable by any user), Flickr (photo sharing), and Digg (content sharing). Traditional media outlets have begun referring to these social media services and others like them as “Web 2.0.” Web 2.0 is not a new version of the web; rather, the term is a reference to the increased focus on user-generated content and social interaction on the web, as well as the evolution of online tools to facilitate that focus. Instead of relying on professional reporters to get information about a protest in Iran, a person could just search for “Iran” on Twitter and likely end up with hundreds of tweets linking to everything from blogs to CNN.com to YouTube videos from Iranian citizens themselves. In addition, many of these tweets may actually be instant updates from people using Twitter in Iran. This allows people to receive information straight from the source, without being filtered through news organizations or censored by governments.
Going Viral
In 2009, Susan Boyle, an unemployed middle-aged Scottish woman, appeared on Britain’s Got Talent and sang “I Dreamed a Dream” from the musical Les Misérables, becoming an international star almost overnight. It was not the broadcast performance itself that catapulted her to fame, sent her subsequently released album to the top of the UK album charts, and kept it there for six weeks; it was a YouTube video of her performance, viewed by 87 million people and counting (YouTube, 2009).
Media that spreads from person to person, as when a friend sends you a link saying “You’ve got to see this!”, is said to have “gone viral.” Marketing and advertising agencies have dubbed advertising that makes use of this phenomenon “viral marketing.” Yet many YouTube sensations have not come from large marketing firms. For instance, the four-piece pop-punk band OK Go filmed a music video on a tiny budget for their song “Here It Goes Again” and released it exclusively on YouTube in 2006. Featuring a choreographed dance done on eight separate treadmills, the video quickly became a viral sensation and, as of May 2011, had more than 7 million views. The video helped OK Go attract millions of new fans and earned them a Grammy Award in 2007, making it one of the most notable successes of viral Internet marketing. Viral marketing is, however, notoriously unpredictable and is liable to spawn remixes, spinoffs, and spoofs that can dilute or damage the messages that marketers intend to spread. Yet, when it is successful, viral marketing can reach millions of people for very little money and can even make it into the mainstream news.
Recent successes and failures in viral marketing demonstrate how difficult it is for marketers to control their message as it is unleashed virally. In 2007, the band Radiohead released their album In Rainbows online, allowing fans to download it for any amount of money they chose—including for free. Despite practically giving the album away, the digital release of In Rainbows still pulled in more money than Radiohead’s previous album, Hail to the Thief, while the band simultaneously sold a huge number of $80 collector editions and still sold physical CDs months after the digital release became available (New Musical Express, 2008). In contrast, the food giant Healthy Choice enlisted Classymommy.com blogger Colleen Padilla to write a sponsored review of its product, leading to a featured New York Times article on the blogger (not the product), which gave the product only a passing mention (Joshi, 2009). Often, a successfully marketed product will reach some people through the Internet and then break through into the mainstream media. Yet as the article about Padilla shows, sometimes the person writing about the product overshadows the product itself.
Not all viral media is marketing, however. In 2007, someone posted a link to a new trailer for Grand Theft Auto IV on the video games message board of the web forum 4chan.org. When users followed the link, they were greeted not with a video game trailer but with Rick Astley singing his 1987 hit “Never Gonna Give You Up.” This technique—redirecting someone to that particular music video—became known as Rickrolling and quickly became one of the most well-known Internet memes of all time (Fox News, 2008). An Internet meme is a concept that quickly replicates itself throughout the Internet, and it is often nonsensical and absurd. Another meme, “Lolcats,” consists of misspelled captions—“I can has cheezburger?” is a classic example—over pictures of cats. Often, these memes take on a metatextual quality, such as the meme “Milhouse is not a meme,” in which the character Milhouse (from the TV show The Simpsons) is told that he is not a meme. Chronicling memes is notoriously difficult, because they typically spring into existence seemingly overnight, propagate rapidly, and disappear before ever making it onto the radar of mainstream media—or even the mainstream Internet user.
Benefits and Problems of Social Media
Social media allows an unprecedented volume of personal, informal communication in real time from anywhere in the world. It allows users to keep in touch with friends on other continents, yet keeps the conversation as casual as a Facebook wall post. In addition, blogs allow us to gauge a wide variety of opinions and have given “breaking news” a whole new meaning. Now, news can be distributed through many major outlets almost instantaneously, and different perspectives on any one event can be aired concurrently. In addition, news organizations can harness bloggers as sources of real-time news, in effect outsourcing some of their news-gathering efforts to bystanders on the scene. This practice of harnessing the efforts of several individuals online to solve a problem is known as crowdsourcing.
The downside of the seemingly infinite breadth of online information is that there is often not much depth to the coverage of any given topic. The superficiality of information on the Internet is a common gripe among many journalists who are now rushed to file news reports several times a day in an effort to compete with the “blogosphere,” or the crowd of bloggers who post both original news stories and aggregate previously published news from other sources. Whereas traditional print organizations at least had the “luxury” of the daily print deadline, now journalists are expected to blog or tweet every story and file reports with little or no analysis, often without adequate time to confirm the reliability of their sources (Auletta, 2010).
Additionally, news aggregators like Google News profit from linking to journalists’ stories at major newspapers and selling advertising, but these profits are not shared with the news organizations and journalists who created the stories. It is often difficult for journalists to keep up with the immediacy of the nonstop news cycle, and with revenues for their efforts being diverted to news aggregators, journalists and news organizations increasingly lack the resources to keep up this fast pace. Twitter presents a similar problem: Instead of getting news from a specific newspaper, many people simply read the articles that are linked from a Twitter feed. As a result, the news cycle leaves journalists no time for analysis or cross-examination. Increasingly, they will simply report, for example, on what a politician or public relations representative says without following up on these comments or fact-checking them. This further shortens the news cycle and makes it much easier for journalists to be exploited as the mouthpieces of propaganda.
Consequently, the very presence of blogs and their seeming importance even among mainstream media has made some critics wary. Internet entrepreneur Andrew Keen is one of these people, and his book The Cult of the Amateur follows up on the famous thought experiment suggesting that infinite monkeys, given infinite typewriters, will one day randomly produce a great work of literature (Huxley): “In our Web 2.0 world, the typewriters aren’t quite typewriters, but rather networked personal computers, and the monkeys aren’t quite monkeys, but rather Internet users” (Keen, 2007). Keen also suggests that the Internet is really just a case of my-word-against-yours, where bloggers are not required to back up their arguments with credible sources. “These days, kids can’t tell the difference between credible news by objective professional journalists and what they read on [a random website]” (Keen, 2007). Commentators like Keen worry that this trend will leave young people unable to distinguish credible information amid a mass of sources, eventually leading to a sharp decrease in credible sources of information.
For defenders of the Internet, this argument seems a bit overwrought: “A legitimate interest in the possible effects of significant technological change in our daily lives can inadvertently dovetail seamlessly into a ‘kids these days’ curmudgeonly sense of generational degeneration, which is hardly new (Downey, 2009).” Greg Downey, who runs the collaborative blog Neuroanthropology, says that fear of kids on the Internet—and on social media in particular—can slip into “a ‘one-paranoia-fits-all’ approach to technological change.” For the argument that online experiences are “devoid of cohesive narrative and long-term significance,” Downey offers that, on the contrary, “far from evacuating narrative, some social networking sites might be said to cause users to ‘narrativize’ their experience, engaging with everyday life already with an eye toward how they will represent it on their personal pages.”
Another argument in favor of social media defies the warning that time spent on social networking sites is destroying the social skills of young people. “The debasement of the word ‘friend’ by [Facebook’s] use of it should not make us assume that users can’t tell the difference between friends and Facebook ‘friends,’” writes Downey. On the contrary, social networks (like the Usenet of the past) can even provide a place for people with more obscure interests to meet one another and share commonalities. In addition, marketing through social media is completely free—making it a valuable tool for small businesses with tight marketing budgets. A community theater can invite all of its “fans” to a new play for less money than putting an ad in the newspaper, and this direct invitation is far more personal and specific. Many people see services like Twitter, with its “followers,” as more semantically appropriate than the “friends” found on Facebook and MySpace, and because of this Twitter has, in many ways, changed yet again the way social media is conceived. Rather than connecting with “friends,” Twitter allows social media to be purely a source of information, thereby making it far more appealing to adults. In addition, while 140 characters may seem like a constraint to some, it can be remarkably useful to the time-strapped user looking to catch up on recent news.
Social media’s detractors also point to the sheer banality of much of the conversation on the Internet. Again, Downey keeps this in perspective: “The banality of most conversation is also pretty frustrating,” he says. Downey suggests that many of the young people using social networking tools see them as just another aspect of communication. However, Downey warns that online bullying has the potential to pervade larger social networks while shielding perpetrators through anonymity.
Another downside of many of the Internet’s segmented communities is that users tend to be exposed only to information they are interested in and opinions they agree with. This lack of exposure to novel ideas and contrary opinions can create or reinforce a lack of understanding among people with different beliefs, and make political and social compromise more difficult to come by.
While the situation may not be as dire as Keen suggests in his book, there are clearly some important arguments to consider regarding the effects of the web and social media in particular. The main concerns come down to two things: the possibility that the volume of amateur, user-generated content online is overshadowing better-researched sources, and the questionable ability of users to tell the difference between the two.
Education, the Internet, and Social Media
Although Facebook began at Harvard University and quickly became popular among the Ivy League colleges, the social network has since been lambasted as a distraction for students. Instead of studying, the argument claims, students will sit in the library and browse Facebook, messaging their friends and getting nothing done. Two doctoral candidates, Aryn Karpinski (Ohio State University) and Adam Duberstein (Ohio Dominican University), studied the effects of Facebook use on college students and found that students who use Facebook generally receive a full grade lower—a half point on the GPA scale—than students who do not (Hamilton, 2009). Correlation does not imply causation, though, as Karpinski said that Facebook users may just be “prone to distraction.”
On the other hand, students’ access to technology and the Internet may allow them to pursue their education to a greater degree than they could otherwise. At a school in Arizona, students are issued laptops instead of textbooks, and some of their school buses have Wi-Fi Internet access. As a result, bus rides, including the long trips that are often a requirement of high school sports, are spent studying. Of course, the students had laptops long before their bus rides were connected to the Internet, but the Wi-Fi technology has “transformed what was often a boisterous bus ride into a rolling study hall (Dillon, 2010).” Even though not all students studied all the time, enabling students to work on bus rides fulfilled the school’s goal of extending the educational hours beyond the usual 8 a.m. to 3 p.m.
Privacy Issues With Social Networking
Social networking provides unprecedented ways to keep in touch with friends, but that ability can sometimes be a double-edged sword. Users can update friends with every latest achievement—“[your name here] just won three straight games of solitaire!”—but may also unwittingly be updating bosses and others from whom particular bits of information should be hidden.
The shrinking of privacy online has been rapidly exacerbated by social networks, and for a surprising reason: conscious decisions made by participants. Putting personal information online—even if it is set to be viewed by only select friends—has become fairly standard. Dr. Kieron O’Hara studies privacy in social media and calls this era “intimacy 2.0” (Kleinman, 2010), a riff on the buzzword “Web 2.0.” One of O’Hara’s arguments is that legal issues of privacy are based on what is called a “reasonable standard.” According to O’Hara, the excessive sharing of personal information on the Internet by some constitutes an offense to the privacy of all, because it lowers the “reasonable standard” that can be legally enforced. In other words, as cultural tendencies toward privacy degrade on the Internet, it affects not only the privacy of those who choose to share their information, but also the privacy of those who do not.
Privacy Settings on Facebook
With over 500 million users, it is no surprise that Facebook has become one of the primary battlegrounds for privacy on the Internet. When Facebook updated its privacy settings in 2009, “privacy groups including the American Civil Liberties Union…[called] the developments ‘flawed’ and ‘worrisome,’” reported The Guardian in late 2009 (Johnson, 2009).
Mark Zuckerberg, the founder of Facebook, discusses privacy issues on a regular basis in forums ranging from his official Facebook blog to conferences. At the Crunchies Awards in San Francisco in early 2010, Zuckerberg claimed that privacy was no longer a “social norm” (Johnson, 2010). This statement follows from his company’s late-2009 decision to make public information sharing the default setting on Facebook. Whereas users were previously able to restrict public access to basic profile information like their names and friends, the new settings make this information publicly available with no option to make it private. Although Facebook publicly announced the changes, many outraged users first learned of the updates to the default privacy settings when they discovered—too late—that they had inadvertently broadcast private information. Facebook argues that the added complexity of the privacy settings gives users more control over their information. However, opponents counter that adding more complex privacy controls while simultaneously making public sharing the default setting for those controls is a blatant ploy to push casual users into sharing more of their information publicly—information that Facebook will then use to offer more targeted advertising (Bankston, 2009).
Many users formed their own grassroots protest groups within Facebook over the privacy policy. In response to these critiques, Facebook changed its privacy policy again in May 2010, with three primary changes. First, privacy controls are simpler: instead of various controls spread across multiple pages, there is now one main control users can use to determine who can see their information. Second, Facebook made less information publicly available; public information is now limited to basic information, such as a user’s name and profile picture. Finally, it is now easier to block applications and third-party websites from accessing user information (Lake, 2010).
Similar to the Facebook controversy, Google’s social networking Gmail add-on called Buzz automatically signed up Gmail users to “follow” the most e-mailed Gmail users in their address book. Because all of these lists were public by default, users’ most e-mailed contacts were made available for anyone to see. This was especially alarming for people like journalists who potentially had confidential sources exposed to a public audience. However, even though this mistake—which Google quickly corrected—created a lot of controversy around Buzz, it did not stop users from creating over 9 million posts in the first 2 days of the service (Jackson, 2010). Google’s integration of Buzz into its Gmail service may have been upsetting to users not accustomed to the pitfalls of social networking, but the misstep has not discouraged millions of others from trying the service, perhaps because Facebook’s ongoing privacy controversies have already accustomed users to such lapses.
For example, Facebook’s old privacy settings integrated a collection of applications (written by third-party developers) that included everything from “Which American Idol Contestant Are You?” to an “Honesty Box” that allowed friends to send anonymous criticism. “Allowing Honesty Box access will let it pull your profile information, photos, your friends’ info, and other content that it requires to work,” read the disclaimer on the application installation page. The ACLU drew particular attention to the “app gap” that allowed “any quiz or application run by you to access information about you and your friends” (Ozer, 2009). In other words, merely using someone else’s Honesty Box gave the program information about your “religion, sexual orientation, political affiliation, pictures, and groups” (Ozer, 2009). There are many reasons that unrelated applications may want to collect this information, but one of the most prominent is, by now, a very old story: selling products. The more information a marketer has, the better he or she can target a message, and the more likely it is that the recipient will buy something.
Social Media’s Effect on Commerce
Social media on the Internet has been around for a while, and it has always been of some interest to marketers. The ability to target advertising based on demographic information given willingly to the service—age, political preference, gender, and location—allows marketers to target advertising extremely efficiently. However, by the time Facebook’s population passed the 350-million mark, marketers were scrambling to harness social media. The increasingly difficult-to-reach younger demographic has been rejecting radio for iPods and TV for YouTube. Increasingly, marketers are turning to social networks as a way to reach these consumers. Culturally, these developments indicate a mistrust among consumers of traditional marketing techniques; marketers must now use new and more personalized ways of reaching consumers if they are going to sell their products.
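The mechanics of demographic targeting described above can be sketched in a few lines of code. This is only an illustrative toy, not any real ad network’s system: the profile fields, ad names, and the simple count-the-matches scoring rule are all invented for the example.

```python
# Toy sketch of demographic ad targeting. All field names, ads, and the
# scoring rule are hypothetical; real ad platforms are far more complex.

def select_ad(user, ads):
    """Return the ad whose targeting criteria best match a user profile.

    Each ad carries a dict of desired attribute values; an ad's score is
    the number of those attributes that match the user's profile.
    """
    def score(ad):
        return sum(1 for key, value in ad["targets"].items()
                   if user.get(key) == value)
    return max(ads, key=score)

# A self-reported profile of the kind users volunteer to social networks.
user = {"age_group": "18-24", "location": "Austin", "interest": "music"}

ads = [
    {"name": "retirement-plan", "targets": {"age_group": "65+"}},
    {"name": "concert-tickets",
     "targets": {"age_group": "18-24", "interest": "music"}},
]

print(select_ad(user, ads)["name"])  # concert-tickets
```

The point of the sketch is simply that the more attributes a user volunteers, the more precisely an advertiser can score and rank messages for that user.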
The attempts of marketers to harness the viral spread of media on the Internet have already been discussed earlier in the chapter. Marketers try to determine the trend of things “going viral,” with the goal of getting millions of YouTube views; becoming a hot topic on Google Trends, a website that measures the most frequently searched topics on the web; or even just being the subject of a post on a well-known blog. For example, Procter & Gamble sent free samples of its Swiffer dust mop to stay-at-home-mom bloggers with a large online audience. And in 2008, the movie College (or College: The Movie) used its tagline “Best.Weekend.Ever.” as the prompt for a YouTube video contest. Contestants were invited to submit videos of their best college weekend ever, and the winner received a monetary prize (Hickey, 2008).
What these two instances of marketing have in common is that they approach people who are already doing something they enjoy doing—blogging or making movies—and give them a relatively small amount of compensation for providing advertising. This differs from methods of traditional advertising because marketers seek to bridge a credibility gap with consumers. Marketers have been doing this for ages—long before breakfast cereal slogans like “Kid Tested, Mother Approved” or “Mikey likes it” ever hit the airwaves. The difference is that now the people pushing the products can be friends or family members, all via social networks.
For instance, in 2007, a program called Beacon was launched as part of Facebook. With Beacon, a Facebook user was given the option to “share” an online purchase from partnering sites. For example, a user might buy a book from Amazon.com and check the corresponding “share” box in the checkout process, and all of his or her friends would receive a message notifying them that this person purchased and recommends this particular product. Explaining the reason for this shift in a New York Times article, Mark Zuckerberg said, “Nothing influences a person more than a trusted friend” (Story, 2007). However, many Facebook users did not want their purchasing information shared with other Facebookers, and the service was shut down in 2009 and subsequently became the subject of a class action lawsuit. Facebook’s troubles with Beacon illustrate the thin line between taking advantage of the tremendous marketing potential of social media and violating the privacy of users.
Facebook’s questionable alliance with marketers through Beacon was driven by a need to create reliable revenue streams. One of the most crucial aspects of social media is the profitability factor. In the 1990s, theGlobe.com was one of the promising new startups, but almost as quickly, it went under due to lack of funds. The lesson of theGlobe.com has not gone unheeded by today’s social media services. For example, Twitter sold Google and Microsoft access to its content for $25 million, making users’ tweets searchable through those companies’ services (Van Buskirk, 2009).
Google’s Buzz is one of the most interesting services in this respect, because Google’s main business is advertising—and it is a highly successful business. Google’s search algorithms allow it to target advertising to a user’s specific tastes. As Google enters the social media world, its advertising capabilities will only be compounded as users reveal more information about themselves via Buzz. Although it does not seem that users choose their social media services based on how the services generate their revenue streams, the issue of privacy in social media is in large part an issue of how much information users are willing to share with advertisers. For example, using Google’s search engine, Buzz, Gmail, and Blogger gives that single company an immense amount of information and a historically unsurpassed ability to market to specific groups. At this relatively early stage of the fledgling online social media business—both Twitter and Facebook only very recently turned a profit, so commerce has only recently come into play—it is impossible to say whether the commerce side of things will transform the way people use the services. If the uproar over Facebook’s Beacon is any lesson, however, the relationship between social media and advertising is ripe for controversy.
Social Media as a Tool for Social Change
The use of Facebook and Twitter in the recent political uprisings in the Middle East has raised the question of whether social media can be an effective tool for social change.
On January 14, 2011, after month-long protests against fraud, economic crisis, and lack of political freedom, the Tunisian public ousted President Zine El Abidine Ben Ali. Soon after the Tunisian rebellion, the Egyptian public expelled President Hosni Mubarak, who had ruled the country for 30 years. Nearly immediately, other Middle Eastern countries such as Algeria, Libya, Yemen, and Bahrain also erupted against their oppressive governments in the hopes of obtaining political freedom (Gamba, 2011).
What is common among all these uprisings is the role played by social media. In nearly all of these countries, restrictions were imposed on the media and government resistance was brutally discouraged (Beaumont, 2011). This seems to have inspired the entire Middle East to organize online to rebel against tyrannical rule (Taylor, 2011). Protesters used social media not only to organize against their governments but also to share their struggles with the rest of the world (Gamba).
In Tunisia, protesters filled the streets by sharing information on Twitter (Taylor). Egypt’s protests were organized on Facebook pages. Details of the demonstrations were circulated by both Facebook and Twitter. E-mail was used to distribute the activists’ guide to challenging the regime (Beaumont). Libyan dissenters too spread the word about their demonstrations similarly (Taylor).
Owing to the role played by Twitter and Facebook in helping protesters organize and communicate with each other, many have termed these rebellions “Twitter Revolutions” (Morozov, 2011) or “Facebook Revolutions” (Davis, 2011) and have credited social media for helping to bring down these regimes (Beardsley, 2011).
During the unrest, social media outlets such as Facebook and Twitter helped protesters share information by communicating ideas continuously and instantaneously. Users took advantage of these unrestricted vehicles to share the most graphic details and images of the attacks on protesters, and to rally demonstrators (Beaumont). In other words, use of social media was about the ability to communicate across borders and barriers. It gave common people a voice and an opportunity to express their opinions.
Critics of social media, however, say that those calling the Middle East movements Facebook or Twitter revolutions are not giving credit where it is due (Villarreal, 2011). It is true that social media provided vital assistance during the unrest in the Middle East. But technology alone could not have brought about the revolutions. The resolve of the people to bring about change was most important, and this fact should be recognized, say the critics (Taylor).
Key Takeaways
- Social networking sites often encompass many aspects of other social media. For example, Facebook began as a collection of profile pictures with very little information, but soon expanded to include photo albums (like Flickr) and micro-blogging (like Twitter). Other sites, like MySpace, emphasize connections to music and customizable pages, catering to a younger demographic. LinkedIn specifically caters to a professional demographic by allowing only certain kinds of information that is professionally relevant.
- Blogs speed the flow of information around the Internet and give nonprofessionals with adequate time a way to investigate sources and news stories without the platform of a well-known publication. On the other hand, they can lead to an “echo chamber” effect, where blogs simply repeat one another and add nothing new. Often, the analysis is wide ranging, but it can also be shallow and lack the depth and knowledge of good critical journalism.
- Facebook has been the leader in privacy-related controversy, with its seemingly constant issues with privacy settings. One of the critical things to keep in mind is that as more people become comfortable with more information out in the open, the “reasonable standard” of privacy is lowered. This affects even people who would rather keep more things private.
- Social networking allows marketers to reach consumers directly and to know more about each specific consumer than ever before. Search algorithms allow marketers to place advertisements in areas that get the most traffic from targeted consumers. Whereas putting an ad on TV reaches all demographics, online advertisements can now be targeted specifically to different groups.
Exercises
- Draw a Venn diagram of two social networking sites mentioned in this chapter. Sign up for both of them (if you’re not signed up already) and make a list of their features and their interfaces. How do they differ? How are they the same?
- Write a few sentences about how a marketer might use these tools to reach different demographics.
References
Auletta, Ken. “Non-Stop News,” Annals of Communications, New Yorker, January 25, 2010, http://www.newyorker.com/reporting/2010/01/25/100125fa_fact_auletta.
Bankston, Kevin. “Facebook’s New Privacy Changes: The Good, the Bad, and the Ugly,” Deeplinks Blog, Electronic Frontier Foundation, December 9, 2009, http://www.eff.org/deeplinks/2009/12/facebooks-new-privacy-changes-good-bad-and-ugly.
Beardsley, Eleanor. “Social Media Gets Credit for Tunisian Overthrow,” NPR, January 16, 2011, http://www.npr.org/2011/01/16/132975274/Social-Media-Gets-Credit-For-Tunisian-Overthrow.
Beaumont, Peter. “Can Social Networking Overthrow a Government?” Morning Herald (Sydney), February 25, 2011, http://www.smh.com.au/technology/technology-news/can-social-networking-overthrow-a-government-20110225-1b7u6.html.
Davis, Eric. “Social Media: A Force for Political Change in Egypt,” April 13, 2011, http://new-middle-east.blogspot.com/2011/04/social-media-force-for-political-change.html.
Dillon, Sam. “Wi-Fi Turns Rowdy Bus Into Rolling Study Hall,” New York Times, February 11, 2010, http://www.nytimes.com/2010/02/12/education/12bus.html.
Downey, Greg. “Is Facebook Rotting Our Children’s Brains?” Neuroanthropology.net, March 2, 2009, http://neuroanthropology.net/2009/03/02/is-facebook-rotting-our-childrens-brains/.
Fox News, “The Biggest Little Internet Hoax on Wheels Hits Mainstream,” April 22, 2008, http://www.foxnews.com/story/0,2933,352010,00.html.
Gamba, Grace. “Facebook Topples Governments in Middle East,” Brimstone Online, March 18, 2011, http://www.gshsbrimstone.com/news/2011/03/18/facebook-topples-governments-in-middle-east.
Hamilton, Anita. “What Facebook Users Share: Lower Grades,” Time, April 14, 2009, http://www.time.com/time/business/article/0,8599,1891111,00.html.
Hickey, Jon. “Best Weekend Ever,” 2008, http://www.youtube.com/watch?v=pldG8MdEIOA.
Jackson, Todd. “Millions of Buzz users, and improvements based on your feedback,” Official Gmail Blog, February 11, 2010, http://gmailblog.blogspot.com/2010/02/millions-of-buzz-users-and-improvements.html.
Johnson, Bobbie. “Facebook Privacy Change Angers Campaigners,” Guardian (London), December 10, 2009, http://www.guardian.co.uk/technology/2009/dec/10/facebook-privacy.
Johnson, Bobbie. “Privacy No Longer a Social Norm, Says Facebook Founder,” Guardian (London), January 11, 2010, http://www.guardian.co.uk/technology/2010/jan/11/facebook-privacy.
Joshi, Pradnya. “Approval by a Blogger May Please a Sponsor,” New York Times, July 12, 2009, http://www.nytimes.com/2009/07/13/technology/internet/13blog.html.
Keen, Andrew. The Cult of the Amateur: How Today’s Internet Is Killing Our Culture (New York: Doubleday, 2007).
Kleinman, Zoe. “How Online Life Distorts Privacy Rights for All,” BBC News, January 8, 2010, http://news.bbc.co.uk/2/hi/technology/8446649.stm.
Lake, Maggie. “Facebook’s privacy changes,” CNN, June 2, 2010, http://www.cnn.com/video/#/video/tech/2010/05/27/lake.facebook.pr.
Morozov, Evgeny. “How Much Did Social Media Contribute to Revolution in the Middle East?” Bookforum, April/May 2011, http://www.bookforum.com/inprint/018_01/7222.
My Parents Joined Facebook, http://myparentsjoinedfacebook.com/, offers examples on the subject.
New Musical Express, “Radiohead Reveal How Successful ‘In Rainbows’ Download Really Was,” October 15, 2008, http://www.nme.com/news/radiohead/40444.
Ozer, Nicole. “Facebook Privacy in Transition – But Where Is It Heading?” ACLU of Northern California, December 9, 2009, http://www.aclunc.org/issues/technology/blog/facebook_privacy_in_transition_-_but_where_is_it_heading.shtml.
Story, Louise. “Facebook Is Marketing Your Brand Preferences (With Your Permission),” New York Times, November 7, 2007, http://www.nytimes.com/2007/11/07/technology/07adco.html.
Taylor, Chris. “Why Not Call It a Facebook Revolution?” CNN, February 24, 2011, http://edition.cnn.com/2011/TECH/social.media/02/24/facebook.revolution.
Van Buskirk, Eliot. “Twitter Earns First Profit Selling Search to Google, Microsoft,” Wired, December 21, 2009, http://www.wired.com/epicenter/2009/12/twitter-earns-first-profit-selling-search-to-google-microsoft.
Villarreal, Alex. “Social Media A Critical Tool for Middle East Protesters,” Voice of America, March 1, 2011, http://www.voanews.com/english/news/middle-east/Social-Media-a-Critical-Tool-for-Middle-East-Protesters-117202583.html.
YouTube, BritainsSoTalented, “Susan Boyle – Singer – Britains Got Talent 2009,” 2009, http://www.youtube.com/watch?v=9lp0IWv8QZY.
11.4 The Effects of the Internet and Globalization on Popular Culture and Interpersonal Communication
Learning Objectives
- Describe the effects of globalization on culture.
- Identify the possible effects of news migrating to the Internet.
- Define the Internet paradox.
It’s in the name: World Wide Web. The Internet has broken down communication barriers between cultures in a way that could only be dreamed of in earlier generations. Now, almost any news service across the globe can be accessed on the Internet and, with the various translation services available (like Babelfish and Google Translate), be relatively understandable. In addition to the spread of American culture throughout the world, smaller countries are now able to cheaply export culture, news, entertainment, and even propaganda.
The Internet has been a key factor in driving globalization in recent years. Many jobs can now be outsourced entirely via the Internet. Teams of software programmers in India can have a website up and running in very little time, for far less money than it would take to hire American counterparts. Communicating with these teams is now as simple as sending e-mails and instant messages back and forth, and often the most difficult aspect of setting up an international video conference online is figuring out the time difference. Especially for electronic services such as software, outsourcing over the Internet has greatly reduced the cost to develop a professionally coded site.
Electronic Media and the Globalization of Culture
The increase of globalization has been an economic force throughout the last century, but economic interdependency is not its only by-product. At its core, globalization is the lowering of economic and cultural impediments to communication between countries all over the globe. Globalization in the sphere of culture and communication can take the form of access to foreign newspapers (without the difficulty of procuring a printed copy) or, conversely, the ability of people living in previously closed countries to communicate experiences to the outside world relatively cheaply.
TV, especially satellite TV, has been one of the primary ways for American entertainment to reach foreign shores. This trend has been going on for some time now, for example, with the launch of MTV Arabia (Arango, 2008). American popular culture is, and has been, a crucial export.
At the Eisenhower Fellowship Conference in Singapore in 2005, U.S. ambassador Frank Lavin gave a defense of American culture that differed somewhat from previous arguments. It would not be all Starbucks, MTV, or Baywatch, he said, because American culture is more diverse than that. Instead, he said that “America is a nation of immigrants,” and asked, “When Mel Gibson or Jackie Chan come to the United States to produce a movie, whose culture is being exported?” (Lavin, 2005). This idea of a truly globalized culture—one in which content can be distributed as easily as it can be received—now has the potential to be realized through the Internet. While some political and social barriers still remain, from a technological standpoint there is nothing to stop the two-way flow of information and culture across the globe.
China, Globalization, and the Internet
The scarcity of artistic resources, the time lag of transmission to a foreign country, and censorship by the host government are a few of the possible impediments to transmission of entertainment and culture. China provides a valuable example of the ways the Internet has helped to overcome (or highlight) all three of these hurdles.
China, as the world’s most populous country and one of its leading economic powers, has considerable clout when it comes to the Internet. In addition, the country is ruled by a single political party that uses censorship extensively in an effort to maintain control. Because the Internet is an open resource by nature, and because China is an extremely well-connected country—with 22.5 percent (roughly 300 million people, or the population of the entire United States) of the country online as of 2008 (Google, 2010)—China has been a case study in how the Internet makes resistance to globalization increasingly difficult.
On January 21, 2010, Hillary Clinton gave a speech in front of the Newseum in Washington, DC, where she said, “We stand for a single Internet where all of humanity has equal access to knowledge and ideas” (Ryan & Halper, 2010). That same month, Google decided it would stop censoring search results on Google.cn, its Chinese-language search engine, as a result of a serious cyber-attack on the company originating in China. In addition, Google stated that if an agreement with the Chinese government could not be reached over the censorship of search results, Google would pull out of China completely. Because Google has complied (albeit uneasily) with the Chinese government in the past, this change in policy was a major reversal.
Withdrawing from one of the largest expanding markets in the world is shocking coming from a company that has been aggressively expanding into foreign markets. This move highlights the fundamental tension between China’s censorship policy and Google’s core values. Google’s company motto, “Don’t be evil,” had long been at odds with its decision to censor search results in China. Google’s compliance with the Chinese government did not help it make inroads into the Chinese Internet search market—although Google held about a quarter of the market in China, most of the search traffic went to the tightly controlled Chinese search engine Baidu. However, Google’s departure from China would be a blow to antigovernment forces in the country. Since Baidu has a closer relationship with the Chinese government, political dissidents tend to use Google’s Gmail, which uses encrypted servers based in the United States. Google’s threat to withdraw from China raises the possibility that globalization could indeed hit roadblocks due to the ways that foreign governments may choose to censor the Internet.
New Media: Internet Convergence and American Society
One only needs to go to CNN’s official Twitter feed and begin to click random faces in the “Following” column to see the effect of media convergence through the Internet. Hundreds of different options abound, many of them individual journalists’ Twitter feeds, and many of those following other journalists. Considering CNN’s motto, “The most trusted name in news,” its presence on Twitter might seem at odds with providing in-depth, reliable coverage. After all, how in-depth can 140 characters get?
The truth is that many of these traditional media outlets use Twitter not as a communication tool in itself, but as a way to allow viewers to aggregate a large amount of information they may have missed. Instead of visiting multiple home pages to see the day’s top stories from multiple viewpoints, Twitter users only have to check their own Twitter pages to get updates from all the organizations they “follow.” Media conglomerates then use Twitter as part of an overall integration of media outlets; the Twitter feed is there to support the news content, not to report the content itself.
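The aggregation described above—one page collecting updates from every followed account into a single chronological stream—amounts to merging several time-ordered feeds. The sketch below illustrates the idea with invented feed names and post data; it is not Twitter’s actual API or implementation.

```python
# Illustrative sketch of feed aggregation: merge several already-
# chronological feeds into one timeline. Feed names, timestamps, and
# post text here are invented for the example.
from heapq import merge

# Each feed is a list of (timestamp, text) posts, oldest first.
feeds = {
    "cnn": [(1, "Top story at 9 a.m."), (4, "Midday update")],
    "reporter_a": [(2, "On the scene"), (3, "Quote from a source")],
}

def timeline(feeds):
    """Merge the followed feeds into one stream, oldest post first."""
    # Tag each post with its source, then merge the sorted feeds.
    tagged = ([(ts, name, text) for ts, text in posts]
              for name, posts in feeds.items())
    return list(merge(*tagged))

for ts, name, text in timeline(feeds):
    print(ts, name, text)
```

The design point mirrors the paragraph above: the aggregator never produces content of its own; it only interleaves what the followed sources have already published.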
Internet-Only Sources
The threshold was crossed in 2008: The Internet overtook print media as a primary source of information for national and international news in the U.S. Television is still far in the lead, but especially among younger demographics, the Internet is quickly catching up as a way to learn about the day’s news. With 40 percent of the public receiving their news from the Internet (see Figure 11.8) (Pew Research Center for the People, 2008), media outlets have been scrambling to set up large presences on the web. Yet one of the most remarkable shifts has been in the establishment of online-only news sources.
The conventional argument claims that the anonymity and the echo chamber of the Internet undermine worthwhile news reporting, especially for topics that are expensive to report on. The ability of large news organizations to put reporters in the field is one of their most important contributions and (because of its cost) is often one of the first things to be cut back during times of budget problems. However, as the Internet has become a primary news source for more and more people, new media outlets—publications existing entirely online—have begun to appear.
In 2006, two reporters for the Washington Post, John F. Harris and Jim VandeHei, left the newspaper to start a politically centered website called Politico. Rather than simply repeating the day’s news in a blog, they were determined to start a journalistically viable news organization on the web. Four years later, the site has over 6,000,000 unique monthly visitors and about a hundred staff members, and there is now a Politico reporter on almost every White House trip (Wolff, 2009).
Far from a collection of amateurs trying to make it big on the Internet, Politico employs veteran journalists: its senior White House correspondent, Mike Allen, previously wrote for The New York Times, Washington Post, and Time. His daily Playbook column appears around 7 a.m. and is read by much of the politically centered media. The different ways that Politico reaches out to its supporters—blogs, Twitter feeds, regular news articles, and now even a print edition—show how media convergence has even occurred within the Internet itself. The interactive nature of its services and the active comment boards on the site also show how the media have become a two-way street: more of a public forum than a straight news service.
“Live” From New York …
Top-notch political coverage is not the only content moving to the Internet, however. Saturday Night Live (SNL) has built an entire entertainment model around its broadcast time slot. Every Saturday around 11:40 p.m., someone interrupts a skit, turns toward the camera, shouts “Live from New York, it’s Saturday Night!” and the band starts playing. Yet the show’s sketch comedy style also seems to lend itself to the watch-anytime convenience of the Internet. In fact, the online TV service Hulu carries a full eight episodes of SNL at any given time, with regular 3.5-minute commercial breaks replaced by Hulu-specific minute-long advertisements. The time listed for an SNL episode on Hulu is just over an hour—a full half-hour less than the time it takes to watch it live on Saturday night.
Hulu calls its product “online premium video,” primarily because of its desire to attract not the YouTube amateur, but rather a partnership of large media organizations. Although many networks, like NBC and Comedy Central, stream video on their websites, Hulu builds its business by offering a legal way to see all these shows on the same site; a user can switch from South Park to SNL with a single click, rather than having to move to a different website.
Premium Online Video Content
Hulu’s success points to a high demand among Internet users for a wide variety of content collected and packaged in one easy-to-use interface. Hulu was rated the Website of the Year by the Associated Press (Coyle, 2008) and even received an Emmy nomination for a commercial featuring Alec Baldwin and Tina Fey, the stars of the NBC comedy 30 Rock (Neil, 2009). Hulu’s success has not been the product of the usual dot-com underdog startup, however. Its two parent companies, News Corporation and NBC Universal, are two of the world’s media giants. In many ways, this was a logical step for these companies to take after fighting online video for so long. In December 2005, the video “Lazy Sunday,” an SNL digital short featuring Andy Samberg and Chris Parnell, went viral with over 5,000,000 views on YouTube before February 2006, when NBC demanded that YouTube take down the video (Biggs, 2006). NBC later posted the video on Hulu, where it could sell advertising for it.
Hulu allows users to break out of programming models controlled by broadcast and cable TV providers and choose freely what shows to watch and when to watch them. This seems to work especially well for cult programs that are no longer available on TV. In 2008, the show Arrested Development, which was canceled in 2006 after repeated time slot shifts, was Hulu’s second-most-popular program.
Hulu certainly seems to have leveled the playing field for some shows that have had difficulty finding an audience through traditional means. 30 Rock, much like Arrested Development, suffered from a lack of viewers in its early years. In 2008, New York Magazine described the show as a “fragile suckling that critics coddle but that America never quite warms up to” (Sternbergh, 2008). However, even as 30 Rock shifted time slots mid-season, its viewer base continued to grow through NBC’s partner site, Hulu. The nontraditional media approach of NBC’s programming culminated in October 2008, when NBC decided to launch the new season of 30 Rock on Hulu a full week before it was broadcast over the airwaves (Wortham, 2008). Hulu’s strategy of providing premium online content seems to have paid off: As of March 2011, Hulu provided 143,673,000 viewing sessions to more than 27 million unique visitors, according to Nielsen (ComScore, 2011).
Unlike other “premium” services, Hulu does not charge for its content; rather, the word premium in its slogan seems to imply that it could charge for content if it wanted to. Other platforms, like Sony’s PlayStation 3, block Hulu for this very reason—Sony’s online store sells the products that Hulu gives away for free. However, Hulu has been considering moving to a paid subscription model that would allow users to access its entire back catalog of shows. Like many other fledgling web enterprises, Hulu seeks to create reliable revenue streams to avoid the fate of many of the companies that folded during the dot-com crash (Sandoval, 2009).
Like Politico, Hulu has packaged professionally produced content into an on-demand web service that can be used without the normal constraints of traditional media. Just as users can comment on Politico articles (and now, on most newspapers’ articles), they can rate Hulu videos, and Hulu will take this into account. Even when users do not produce the content themselves, they still want this same “two-way street” service.
The Role of the Internet in Social Alienation
In the early years, the Internet was stigmatized as a tool for introverts to avoid “real” social interactions, thereby increasing their alienation from society. Yet the Internet was also seen as a potentially great connecting force between cultures all over the world. The idea that something that allowed communication across the globe could breed social alienation seemed counterintuitive. The American Psychological Association (APA) dubbed this phenomenon the “Internet paradox.”
Studies like the APA’s 1998 report “Internet Paradox: A Social Technology That Reduces Social Involvement and Psychological Well-Being?” (Kraut et al., 1998) suggested that teens who spent lots of time on the Internet showed much greater rates of self-reported loneliness and other signs of psychological distress. Even though the Internet had been around for a while by 1998, parents were increasingly concerned that teenagers were spending all their time in chat rooms and online. Teenagers did, in fact, spend much more time on the Internet than adults, owing to their greater free time, curiosity, and familiarity with technology.
However, this did not necessarily mean that “kids these days” were antisocial or that the Internet caused depression and loneliness. In his critical analysis “Deconstructing the Internet Paradox,” Joseph M. Newcomer, a computer scientist, writer, and PhD recipient from Carnegie Mellon University, points out that the APA study did not include a control group to adjust for what may be normal “lonely” feelings in teenagers. Instead, he suggests that “involvement in any new, self-absorbing activity which has opportunity for failure can increase depression,” seeing Internet use as just another time-consuming hobby, much like learning a musical instrument or playing chess (Newcomer, 2000).
The concern that teenagers were spending all their time in chat rooms and online forums instead of hanging out with flesh-and-blood friends was not especially new; the same suspicion had attached to the computer hobbyists who pioneered the esoteric Usenet. However, the concern was amplified when a much wider range of young people began using the Internet.
The “Internet Paradox” and Facebook
It quickly became apparent that the Internet generation did not, as a rule, suffer from perpetual loneliness. After all, the generation that was raised on instant messaging invented Facebook and still makes up most of Facebook’s audience. As detailed earlier in the chapter, Facebook began as a service limited to college students—a requirement that practically excluded older participants. As a social tool and as a reflection of the way younger people now connect with each other over the Internet, Facebook has provided a comprehensive model for the Internet’s effect on social skills and especially on education.
A study by the Michigan State University Department of Telecommunication, Information Studies, and Media has shown that college-age Facebook users connect with offline friends twice as often as they connect with purely online “friends” (Ellison et al., 2007). In fact, 90 percent of the participants in the study reported that high school friends, classmates, and other friends were the top three groups that their Facebook profiles were directed toward.
In 2007, when this study took place, one of Facebook’s most remarkable tools for studying the ways that young people connect was its “networks” feature. Originally, a Facebook user’s network consisted of all the people at his or her college e-mail domain: the “mycollege” portion of “me@mycollege.edu.” The MSU study, performed in April 2006, just 6 months after Facebook opened its doors to high school students, found that first-year students met new people on Facebook 36 percent more often than seniors did. These freshmen were not yet as active on Facebook as high schoolers, since Facebook had begun admitting high schoolers only during these students’ first semester in school (Rosen, 2005). The study concluded that researchers could “definitively state that there is a positive relationship between certain kinds of Facebook use and the maintenance and creation of social capital” (Ellison et al., 2007). In other words, even though the study cannot show whether Facebook use causes or results from social connections, it can say that Facebook plays both an important and a nondestructive role in the forming of social bonds.
Although this study provides a detailed and balanced picture of the role that Facebook played for college students in early 2006, there have since been many changes in Facebook’s design and popularity. In 2006, many of a user’s “friends” were from the same college, and the whole college network might be mapped as a “friend-of-a-friend” web. If users allowed all people within a single network access to their profiles, it created a voluntary school-wide directory of students. Since a university e-mail address was required for signup, there was a certain level of trust. The results of this study, still relatively current in showing the Internet’s effects on social capital, indicate that social networking tools do not lead to greater isolation; on the contrary, they have become integral to some types of networking.
However, as Facebook began to grow and as high school and regional networks (such as “New York City” or “Ireland”) were incorporated, users’ networks of friends grew exponentially, and the networking feature became increasingly unwieldy for privacy purposes. In 2009, Facebook discontinued regional networks over concerns that networks consisting of millions of people were “no longer the best way for you to control your privacy (Zuckerberg, 2009).” Where privacy controls once consisted of allowing everyone at one’s college access to specific information, Facebook now allows only three levels: friends, friends of friends, and everyone.
Meetup.com: Meeting Up “IRL”
Of course, not everyone on a teenager’s online friends list is actually a friend outside of the virtual world. In the parlance of the early days of the Internet, meeting up “IRL” (shorthand for “in real life”) was one of the main reasons that many people got online. This practice was often looked at with suspicion by those not familiar with it, especially because of the anonymity of the Internet. The fear among many was that children would go into chat rooms and agree to meet up in person with a total stranger, and that stranger would turn out to have less-than-friendly motives. This fear led to law enforcement officers posing as underage girls in chat rooms, agreeing to meet for sex with older men (after the men brought up the topic—the other way around could be considered entrapment), and then arresting the men at the agreed-upon meeting spot.
In recent years, however, the Internet has become a hub of activity for all sorts of people. In 2002, Scott Heiferman started Meetup.com based on the “simple idea of using the Internet to get people off the Internet” (Heiferman, 2009). The entire purpose of Meetup.com is not to foster global interaction and collaboration (as is the purpose of something like Usenet) but rather to allow people to organize locally. There are Meetups for politics (popular during Barack Obama’s presidential campaign), for New Yorkers who own Boston terriers (Fairbanks, 2008), for vegan cooking, for board games, and for practically everything else. Essentially, the service (which charges a small fee to Meetup organizers) separates itself from other social networking sites by encouraging real-life interaction. Whereas a member of a Facebook group may never see or interact with fellow members, Meetup.com actually keeps track of the (self-reported) real-life activity of its groups—ideally, groups with more activity are more desirable to join. However much time these groups spend together on or off the Internet, one group of people undoubtedly has the upper hand when it comes to online interaction: World of Warcraft players.
World of Warcraft: Social Interaction Through Avatars
A writer for Time states the reasons for the massive popularity of online role-playing games quite well: “[My generation’s] assumptions were based on the idea that video games would never grow up. But no genre has worked harder to disprove that maxim than MMORPGs—Massively Multiplayer Online Games” (Coates, 2007). World of Warcraft (WoW, for short) is the most popular MMORPG of all time, with over 11 million subscriptions and counting. The game is inherently social; players must complete “quests” in order to advance, and many of the quests are significantly easier with multiple people. Players often form small four- to five-person groups in the beginning of the game, but by the endgame these groups (called “raiding parties”) can reach up to 40 players.
In addition, WoW provides a highly developed social networking feature called “guilds.” Players create or join a guild, which they can then use to band with other guilds in order to complete some of the toughest quests. “But once you’ve got a posse, the social dynamic just makes the game more addictive and time-consuming,” writes Clive Thompson for Slate (Thompson, 2005). Although these guilds do occasionally meet up in real life, most of their time together is spent online, often for hours per day, and some guild leaders profess to seeing real-life benefits. Joi Ito, an Internet business and investment guru, joined WoW long after he had worked with some of the most successful Internet companies; he says he “definitely” learned new lessons about leadership from playing the game (Pinckard, 2006). Writing for the video game blog 1UP, Jane Pinckard lists some of Ito’s favorite activities as “looking after newbs [lower-level players] and pleasing the veterans,” which he calls a “delicate balancing act” (Pinckard, 2006), even for an ex-CEO.
With over 12 million subscribers, WoW has broken the boundaries of previous MMORPGs. The social nature of the game has attracted unprecedented numbers of female players (although men still make up the vast majority), and its players cannot easily be pegged as antisocial video game addicts. On the contrary, they might even be called social video game players, judging from players’ own accounts of why they enjoy the game. This type of play points to a new mode of online interaction that may continue to grow in coming years.
Social Interaction on the Internet Among Low-Income Groups
In 2006, the journal Developmental Psychology published a study looking at the educational benefits of the Internet for teenagers in low-income households. It found that “children who used the Internet more had higher grade point averages (GPA) after one year and higher scores on standardized tests of reading achievement after six months than did children who used it less,” and that continuing to use the Internet more as the study went on led to an even greater increase in GPA and standardized reading test scores (there was no change in mathematics test scores) (Jackson et al., 2006).
One of the most interesting aspects of the study’s results is the suggestion that these benefits may not extend equally to all children in low-income households. The reason, the study suggests, is that children in low-income households likely have a social circle consisting of other children from low-income households who are also unlikely to be connected to the Internet. As a result, after 16 months of Internet usage, only 16 percent of the participants were using e-mail and only 25 percent were using instant messaging services. The researchers also suggested that, because “African-American culture is historically an ‘oral culture’” and 83 percent of the participants were African American, the “impersonal nature of the Internet’s typical communication tools” may have led participants to continue to prefer face-to-face contact. In other words, social interaction on the Internet can happen only if one’s friends are also on the Internet.
The Way Forward: Communication, Convergence, and Corporations
On February 15, 2010, the firm Compete, which analyzes Internet traffic, reported that Facebook surpassed Google as the No. 1 site to drive traffic toward news and entertainment media on both Yahoo! and MSN (Ingram, 2010). This statistic is a strong indicator that social networks are quickly becoming one of the most effective ways for people to sift through the ever-increasing amount of information on the Internet. It also suggests that people are content to get their news the way they did before the Internet or most other forms of mass media were invented—by word of mouth.
Many companies now use the Internet to leverage word-of-mouth social networking. The expansion of corporations onto Facebook has given the service a big publicity boost, which has no doubt contributed to the growth of its user base and, in turn, helped the corporations that put marketing efforts into the service. Putting a corporation on Facebook is not without risk: any corporate post can be commented on by over 500 million users, and there is no way to ensure that those users will say positive things about the company. Good or bad, communicating with corporations is now a two-way street.
Key Takeaways
- The Internet has made pop culture transmission a two-way street. The power to influence popular culture no longer lies with the relative few with control over traditional forms of mass media; it is now available to the great mass of people with access to the Internet. As a result, the cross-fertilization of pop culture from around the world has become a commonplace occurrence.
- The Internet’s key difference from traditional media is that it does not operate on a fixed periodic schedule. It is not “periodical” in the sense of coming out in daily or weekly editions; it is always being updated. As a result, many journalists file both “regular” news stories and blog posts that may be updated and that can appear at varied intervals as necessary. This allows them to stay up-to-date with breaking news without necessarily sacrificing the next day’s more in-depth story.
- The “Internet paradox” is the hypothesis that although the Internet is a tool for communication, many teenagers who use the Internet lack social interaction and become antisocial and depressed. It has been largely disproved, especially since the Internet has grown so drastically. Many sites, such as Meetup.com or even Facebook, work to allow users to organize for offline events. Other services, like the video game World of Warcraft, serve as an alternate social world.
Exercises
- Make a list of ways you interact with friends, either in person or on the Internet. Are there particular methods of communication that only exist in person?
- Are there methods that exist on the Internet that would be much more difficult to replicate in person?
- How do these disprove the “Internet paradox” and contribute to the globalization of culture?
- Pick a method of in-person communication and a method of Internet communication, and compare and contrast these using a Venn diagram.
References
Arango, Tim. “World Falls for American Media, Even as It Sours on America,” New York Times, November 30, 2008, http://www.nytimes.com/2008/12/01/business/media/01soft.html.
Biggs, John. “A Video Clip Goes Viral, and a TV Network Wants to Control It,” New York Times, February 20, 2006, http://www.nytimes.com/2006/02/20/business/media/20youtube.html.
Coates, Ta-Nehisi Paul. “Confessions of a 30-Year-Old Gamer,” Time, January 12, 2007, http://www.time.com/time/arts/article/0,8599,1577502,00.html.
ComScore, “comScore Releases March 2011 U.S. Online Video Rankings,” April 12, 2011, http://www.comscore.com/Press_Events/Press_Releases/2011/4/comScore_Releases_March_2011_U.S._Online_Video_Rankings.
Coyle, Jake. “On the Net: Hulu Is Web Site of the Year,” Seattle Times, December 19, 2008, http://seattletimes.nwsource.com/html/entertainment/2008539776_aponthenetsiteoftheyear.html.
Ellison, Nicole B., Charles Steinfield, and Cliff Lampe, “The Benefits of Facebook ‘Friends’: Social Capital and College Students’ Use of Online Social Network Sites,” Journal of Computer-Mediated Communication 14, no. 4 (2007).
Fairbanks, Amanda M. “Funny Thing Happened at the Dog Run,” New York Times, August 23, 2008, http://www.nytimes.com/2008/08/24/nyregion/24meetup.html.
Google, “Internet users as percentage of population: China,” February 19, 2010, http://www.google.com/publicdata?ds=wb-wdi&met=it_net_user_p2&idim=country: CHN&dl=en&hl=en&q=china+internet+users.
Heiferman, Scott. “The Pursuit of Community,” New York Times, September 5, 2009, http://www.nytimes.com/2009/09/06/jobs/06boss.html.
Ingram, Mathew. “Facebook Driving More Traffic Than Google,” New York Times, February 15, 2010, http://www.nytimes.com/external/gigaom/2010/02/15/15gigaom-facebook-driving-more-traffic-than-google-42970.html.
Jackson, Linda A. and others, “Does Home Internet Use Influence the Academic Performance of Low-Income Children?” Developmental Psychology 42, no. 3 (2006): 433–434.
Kraut, Robert and others, “Internet Paradox: A Social Technology That Reduces Social Involvement and Psychological Well-Being?” American Psychologist, September 1998, http://psycnet.apa.org/index.cfm?fa=buy.optionToBuy&id=1998-10886-001.
Lavin, Frank. “‘Globalization and Culture’: Remarks by Ambassador Frank Lavin at the Eisenhower Fellowship Conference in Singapore,” U.S. Embassy in Singapore, June 28, 2005, http://singapore.usembassy.gov/062805.html.
Neil, Dan. “‘30 Rock’ Gets a Wink and a Nod From Two Emmy-Nominated Spots,” Los Angeles Times, July 21, 2009, http://articles.latimes.com/2009/jul/21/business/fi-ct-neil21.
Newcomer, Joseph M. “Deconstructing the Internet Paradox,” Ubiquity, Association for Computing Machinery, April 2000, http://ubiquity.acm.org/article.cfm?id=334533. (Originally published as an op-ed in the Pittsburgh Post-Gazette, September 27, 1998.)
Pew Research Center for the People & the Press, “Internet Overtakes Newspapers as News Outlet,” December 23, 2008, http://people-press.org/report/479/internet-overtakes-newspapers-as-news-source.
Pinckard, Jane. “Is World of Warcraft the New Golf?” 1UP.com, February 8, 2006, http://www.1up.com/news/world-warcraft-golf.
Rosen, Ellen. “THE INTERNET; Facebook.com Goes to High School,” New York Times, October 16, 2005, http://query.nytimes.com/gst/fullpage.html?res=9C05EEDA173FF935A25753C1A9639C8B63&scp=5&sq=facebook&st=nyt.
Ryan, Johnny and Stefan Halper, “Google vs China: Capitalist Model, Virtual Wall,” OpenDemocracy, January 22, 2010, http://www.opendemocracy.net/johnny-ryan-stefan-halper/google-vs-china-capitalist-model-virtual-wall.
Sandoval, Greg. “More Signs Hulu Subscription Service Is Coming,” CNET, October 22, 2009, http://news.cnet.com/8301-31001_3-10381622-261.html.
Sternbergh, Adam. “‘The Office’ vs. ‘30 Rock’: Comedy Goes Back to Work,” New York Magazine, April 10, 2008, http://nymag.com/daily/entertainment/2008/04/the_office_vs_30_rock_comedy_g.html.
Thompson, Clive. “An Elf’s Progress: Finally, Online Role-Playing Games That Won’t Destroy Your Life,” Slate, March 7, 2005, http://www.slate.com/id/2114354.
Wolff, Michael. “Politico’s Washington Coup,” Vanity Fair, August 2009, http://www.vanityfair.com/politics/features/2009/08/wolff200908.
Wortham, Jenna. “Hulu Airs Season Premiere of 30 Rock a Week Early,” Wired, October 23, 2008, http://www.wired.com/underwire/2008/10/hulu-airs-seaso/.
Zuckerberg, Mark. “An Open Letter from Facebook Founder Mark Zuckerberg,” Facebook, December 1, 2009, http://blog.facebook.com/blog.php?post=190423927130.
11.5 Issues and Trends
Learning Objectives
- Define information superhighway as it relates to the Internet.
- Identify ways to identify credible sources online.
- Define net neutrality.
- Describe some of the effects of the Internet and social media on traditional media.
By 1994, the promise of the “information superhighway” had become so potent that it was given its own summit on the University of California Los Angeles campus. The country was quickly realizing that the spread of the web could be harnessed for educational purposes; more than just the diversion of computer hobbyists, this new vision of the web would be a constant learning resource that anyone could use.
The pioneering American video artist Nam June Paik takes credit for the term information superhighway, which he used in a study for the Rockefeller Foundation in 1974, long before the existence of Usenet. In 2001, he said, “If you create a highway, then people are going to invent cars. That’s dialectics. If you create electronic highways, something has to happen” (The Biz Media, 2010). Paik’s prediction proved to be startlingly prescient.
Al Gore’s use of the term in the House of Representatives (and later as vice president) had a slightly different meaning and context. To Gore, the promise of the Interstate Highway System during the Eisenhower era was that the government would work to allow communication across natural barriers, and that citizens could then utilize these channels to conduct business and communicate with one another. Gore saw the government as playing an essential role in maintaining the pathways of electronic communication. Allowing business interests to get involved would compromise what he saw as a necessarily neutral purpose; a freeway doesn’t judge or demand tolls—it is a public service—and neither should the Internet. During his 2000 presidential campaign, Gore was wrongly ridiculed for supposedly saying that he “invented the Internet,” but in reality his work in the House of Representatives played a crucial part in developing the infrastructure required for Internet access.
However, a certain amount of money was necessary to get connected to the web. In this respect, AOL was like the Model T of the Internet—it put access to the information superhighway within reach of the average person. But despite the affordability of AOL and the services that succeeded it, certain demographics continued to go without access to the Internet, a problem known as the “digital divide,” which you will learn more about in this section.
From speed of transportation, to credibility of information (don’t trust the stranger at the roadside diner), to security of information (keep the car doors locked), to net neutrality (toll-free roads), to the possibility of piracy, the metaphor of the information superhighway has proved to be remarkably apt. All of these issues have played out in different ways, both positive and negative, and they continue to develop to this day.
Information Access Like Never Before
In December 2002, a survey by the Pew Internet & American Life Project found that 84 percent of Americans believed that they could find information on health care, government, news, or shopping on the Internet (Jesdanun, 2002). This belief in a decade-old system of interconnected web pages would be remarkable in itself, but considering that 37 percent of respondents were not even connected to the Internet, it becomes even more striking. In other words, of the Americans without Internet connections, 64 percent still believed that the Internet could be a source of information about these crucial topics. In addition, of those who expected to find such information, at least 70 percent succeeded; news and shopping were the most successful topics, and government was the least. This survey shows that most Americans believed the Internet was indeed an effective source of information. Once again, the Internet’s role in education was heralded as the way of the future, and technology was seen as leveling the playing field for all students.
Nowhere was this more apparent than in the Bush administration’s 2004 report, “Toward a New Golden Age in Education: How the Internet, the Law, and Today’s Students Are Revolutionizing Expectations.” By this time, the term digital divide was already widely used, and efforts to “bridge” it ranged from putting computers in classrooms to giving personal computers to some high-need students for use at home.
The report stated that an “explosive growth” in sectors such as e-learning and virtual schools allowed each student “individual online instruction (U.S. Department of Education, 2004).” More than just being able to find information online, people expected the Internet to provide virtually unlimited access to educational opportunities. To make this expectation a reality, one of the main investments that the paper called for was increased broadband Internet access. As Nam June Paik predicted, stringing fiber optics around the world would allow for seamless video communication, a development that the Department of Education saw as integral to its vision of educating through technology. The report called for broadband access “24 hours a day, seven days a week, 365 days a year,” saying that it could “help teachers and students realize the full potential of this technology (U.S. Department of Education, 2004).”
Rural Areas and Access to Information
One of the founding principles of many public library systems is to allow for free and open access to information. Historically, one of the major roadblocks to achieving this goal has been a simple one: location. Those living in rural areas or those with limited access to transportation simply could not get to a library. But with the spread of the Internet, the hope was that a global library would be created—an essential prospect for rural areas.
One of the most remarkable educational success stories in the Department of Education’s study is that of the Chugach School District in Alaska. In 1994, this district was the lowest performing in the state: over 50 percent staff turnover, the lowest standardized test scores, and only one student in 26 years graduating from college (U.S. Department of Education, 2004). The school board instituted drastic measures, amounting to a complete overhaul of the system. They abolished grade levels, focusing instead on achievement, and by 2001 had increased Internet usage from 5 percent to 93 percent.
The Department of Education study emphasizes these numbers, and with good reason: standardized test scores rose from the 20th percentile range to the 70th percentile range over a period of 4 years, in both math and language arts. Yet these advances were not exclusive to low-performing rural students. In Florida, the Florida Virtual School system allowed rural school districts to offer advanced-placement coursework. Students excelling in rural areas could now study topics that were previously limited to districts that could fill (and fund) an entire classroom. Just as the Interstate Highway System commercially connected the most remote rural communities to large cities, the Internet has brought rural areas even further into the global world, especially in regard to the sharing of information and knowledge.
The Cloud: Instant Updates, Instant Access
As technology has improved, it has become possible to provide software to users as a service that resides entirely online, rather than on a person’s personal computer. Since people can now be connected to the Internet constantly, they can use online programs to do all of their computing. It is no longer absolutely necessary to have, for example, a program like Microsoft Word to compose documents; this can be done through an online service like Google Docs or Zoho Writer.
“Cloud computing” is the process of outsourcing common computing tasks to a remote server. The actual work is not done by the computer attached to the user’s monitor, but by other (maybe many other) computers in the “cloud.” As a result, the computer itself does not actually need that much processing power; instead of calculating “1 + 1 = 2,” the user’s computer asks the cloud, “What does 1 + 1 equal?” and receives the answer. Meanwhile, the system resources that a computer would normally devote to completing these tasks are freed up to be used for other things. An additional advantage of cloud computing is that data can be stored in the cloud and retrieved from any computer, making a user’s files more conveniently portable and less vulnerable to hardware failures like a hard drive crash. Of course, it can require quite a bit of bandwidth to send these messages back and forth to a remote server in the cloud, and in the absence of a reliable, always-on Internet connection, the usefulness of these services can be somewhat limited.
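The division of labor described above—a thin client that merely serializes a question and lets a remote machine do the arithmetic—can be sketched in a few lines of Python. This is an illustrative toy, not any real cloud API: the “server” here is just a local function standing in for a remote machine, and the JSON strings stand in for network traffic.

```python
import json

def cloud_server(request_json: str) -> str:
    """Stands in for the remote 'cloud': parses a request, does the
    actual computation, and returns the answer as JSON."""
    request = json.loads(request_json)
    result = request["a"] + request["b"]  # the real work happens here
    return json.dumps({"result": result})

def thin_client_add(a: int, b: int) -> int:
    """The user's machine does no arithmetic itself; it only packages
    the question, 'sends' it to the cloud, and reads back the answer."""
    request_json = json.dumps({"a": a, "b": b})  # in practice, an HTTP request
    response_json = cloud_server(request_json)   # in practice, a network round trip
    return json.loads(response_json)["result"]

print(thin_client_add(1, 1))  # → 2
```

The point of the sketch is that `thin_client_add` never computes anything: whether the request is `1 + 1` or something enormously expensive, the client-side code stays the same, which is exactly why low-powered devices can rely on the cloud.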
The concept of the cloud takes into account all the applications that are hosted on external machines and viewed on a user’s computer. Google Docs, which provides word processors, spreadsheets, and other tools, and Microsoft’s Hotmail, which provides e-mail access, both constitute aspects of the “cloud.” These services are becoming even more popular with the onset of mobile applications and netbooks, which are small laptops with relatively little processing power and storage space that rely on cloud computing. A netbook does not need the processing power required to run Microsoft Word; as long as it has a web browser, it can run the Google Docs word processor and leave (almost) all of the processing to the cloud. Because of this evolution of the Internet, computers can be built less like stand-alone machines and more like interfaces for interacting with the larger system in the cloud.
One result of cloud computing has been the rise in web applications for mobile devices, such as the iPhone, BlackBerry, and devices that use Google’s Android operating system. 3G networks, which are cell phone networks capable of high-speed data transfer, can augment the computing power of phones just by giving the phones the ability to send data somewhere else to be processed. For example, a Google Maps application does not actually calculate the shortest route between two places (taking into account how highways are quicker than side roads, and numerous other computational difficulties) but rather just asks Google to do the calculation and send over the result. 3G networks have made this possible in large part because the speed of data transfer has now surpassed the speed of cell phones’ calculation abilities. As cellular transmission technology continues to improve with the rollout of the next-generation 4G networks (the successors to 3G networks), connectivity speeds will further increase and allow for a focus on ever-more-comprehensive provisions for multimedia.
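The route computation described above as being offloaded to a remote server is, at its core, a weighted shortest-path problem. The sketch below uses Dijkstra's algorithm on a toy road graph whose edge weights are travel times, so a two-hop "highway" can beat a direct side road; the graph, place names, and weights are invented for illustration and say nothing about how Google Maps actually computes routes.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over a weighted road graph. Edge weights are
    travel times, so the fastest route need not have the fewest hops."""
    distances = {start: 0}
    previous = {}
    queue = [(0, start)]
    visited = set()
    while queue:
        cost, node = heapq.heappop(queue)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            break
        for neighbor, weight in graph.get(node, []):
            new_cost = cost + weight
            if new_cost < distances.get(neighbor, float("inf")):
                distances[neighbor] = new_cost
                previous[neighbor] = node
                heapq.heappush(queue, (new_cost, neighbor))
    # Walk back from the goal to reconstruct the route.
    path, node = [goal], goal
    while node != start:
        node = previous[node]
        path.append(node)
    return list(reversed(path)), distances[goal]

# Travel times in minutes: the two-hop A->C->D "highway" (3 + 4 = 7)
# beats the direct A->D side road (20).
roads = {
    "A": [("B", 10), ("C", 3), ("D", 20)],
    "C": [("D", 4)],
    "B": [("D", 2)],
}
print(shortest_route(roads, "A", "D"))  # → (['A', 'C', 'D'], 7)
```

Even on this tiny graph, the search explores several candidate routes; on a continent-scale road network the same computation involves millions of nodes, which is precisely the kind of work a phone is happy to hand off over a 3G connection.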
Credibility Issues: (Dis)information Superhighway?
The Internet has undoubtedly been a boon for researchers and writers everywhere. Online services range from up-to-date news and media to vast archives of past writing and scholarship. However, since the Internet is open to any user, anyone with a few dollars can set up a credible-sounding website and begin to disseminate false information.
This is not necessarily a problem with the Internet specifically; any traditional medium can—knowingly or unknowingly—publish unreliable or outright false information. But the explosion of available sources on the Internet has caused a bit of a dilemma for information seekers. The difference is that much of the information on the Internet is not the work of professional authors, but of amateurs who have questionable expertise. On the Internet, anyone can self-publish, so the vetting that usually occurs in a traditional medium—for example, by a magazine’s editorial department—rarely happens online.
That said, if an author who is recognizable from elsewhere writes something online, it may point to more reliable information (Kirk, 1996). In addition, looking for a trusted name on the website can lend more assurance of reliability. For example, krugmanonline.com, the official site of Princeton economist Paul Krugman, does not list any authorial data. Even statements like “Nobel Prize Winner and Op-Ed Columnist for the New York Times” do not actually say anything about the author of the website. Much of the content is aggregated from the web as well. However, the bottom-left corner of the page has the mark “© 2009 W. W. Norton & Company, Inc.” (Krugman’s publisher). Therefore, a visitor might decide to pick and choose which information to trust. The author is clearly concerned with selling Krugman’s books, so the glowing reviews may need to be verified elsewhere; on the other hand, the author biography is probably fairly accurate, since the publishing company has direct access to Krugman, and Krugman himself probably looked it over to make sure it was valid. Taking the authorship of a site into account is a necessary step when judging information; more than just hunting down untrue statements, it can give insight into subtle bias that may arise and point to further research that needs to be done.
Just Trust Me: Bias on the Web
One noticeable thing on Paul Krugman’s site is that all of his book reviews are positive. Although these are probably real reviews, they may not be representative of his critical reception at large. Mainstream journalistic sources usually attempt to achieve some sort of balance in their reporting; given reasonable access, they will interview opposing viewpoints and reserve judgment for the editorial page. Corporate sources, like Krugman’s site, will instead tilt the information toward their product.
Often, the web is viewed as a source of entertainment, even in its informational capacity. Because of this, sites that rely on advertising may choose to publish something more inflammatory that will be linked to and forwarded more for its entertainment value than for its informational qualities.
On the other hand, a website might attempt to present itself as a credible source of information about a particular product or topic, with the end goal of selling something. A website that gives advice on how to protect against bedbugs that includes a direct link to its product may not be the best source of information on the topic. While so much on the web is free, it is worthwhile looking into how websites actually maintain their services. If a website is giving something away for free, the information might be biased, because it must be getting its money from somewhere. The online archive of Consumer Reports requires a subscription to access it. Ostensibly, this subscription revenue allows the service to exist as an impartial judge, serving the users rather than the advertisers.
Occasionally, corporations may set up “credible” fronts to disseminate information. Because sources may look reliable, it is always important to investigate further. Global warming is a contentious topic, and websites about the issue often represent the bias of their owners. For example, the Cato Institute publishes anti-global-warming theory columns in many newspapers, including well-respected ones such as the Washington Times. Patrick Basham, an adjunct scholar at the Cato Institute, published the article “Live Earth’s Inconvenient Truths” in the Washington Times on July 11, 2007. Basham writes, “Using normal scientific standards, there is no proof we are causing the Earth to warm, let alone that such warming will cause an environmental catastrophe (Basham, 2007).”
However, the website ExxposeExxon.com states that the Cato Institute received $125,000 from the oil giant ExxonMobil, possibly tainting its data with bias (Exxon, 2006). In addition, ExxposeExxon.com is run as a side project of the international environmental nonprofit Greenpeace, which may have its own reasons for producing this particular report. The document available on Greenpeace’s site (a scanned version of Exxon’s printout) states that in 2006, the corporation gave $20,000 to the Cato Institute (Greenpeace, 2007) (the other $105,000 was given over the previous decade).
This back-and-forth highlights the difficulty of finding credible information online, especially when money is at stake. In addition, it shows how conflicting sources may go to great lengths—sorting through a company’s corporate financial reports—in order to expose what they see as falsehoods. What is the upside to all of this required fact-checking and cross-examination? Before the Internet, this probably would have required multiple telephone calls and plenty of time waiting on hold. While the Internet has made false information more widely available, it has also made checking that information incredibly easy.
Wikipedia: The Internet’s Precocious Problem Child
Nowhere has this cross-examination and cross-listing of sources been more widespread than with Wikipedia. Information free and available to all? That sounds like a dream come true—a dream that Wikipedia founder Jimmy Wales was ready to pursue. Since the site began in 2001, Wikipedia (whose pages are hosted by the Wikimedia Foundation) has become the sixth-most-visited site on the web, barely behind eBay in terms of its unique page views.
Organizations had long been trying to develop factual content for the web, but Wikipedia went for something else: verifiability. The guidelines for editing Wikipedia state: “What counts is whether readers can verify that material added to Wikipedia has already been published by a reliable source, not whether editors think it is true (Wikipedia).” The benchmark for inclusion on Wikipedia includes outside citations for any content “likely to be challenged” and for “all quotations.”
While this may seem like it’s a step ahead of many other sources on the Internet, there is a catch: Anyone can edit Wikipedia. This has a positive and negative side—though anyone can vandalize the site, anyone can also fix it. In addition, calling a particularly contentious page to attention can result in one of the site’s administrators placing a warning at the top of the page stating that the information is not necessarily verified. Other warnings include notices on articles about living persons, which are given special attention, and articles that may violate Wikipedia’s neutrality policy. This neutrality policy is a way to mitigate the extreme views that may be posted on a page with open access, allowing the community to decide what constitutes a “significant” view that should be represented (Wikipedia).
As long as users do not take the facts on Wikipedia at face value and make sure to follow up on the relevant sources linked in the articles they read, the site is an extremely useful reference tool that gives users quick access to a wide range of subjects. However, articles on esoteric subjects can be especially prone to vandalism or poorly researched information. Since every reader is a potential editor, a lack of readers can lead to a poorly edited page because errors, whether deliberate or not, go uncorrected. In short, the lack of authorial credit can lead to problems with judging bias and relevance of information, so the same precautions must be taken with Wikipedia as with any other online source, primarily in checking references. The advantage of Wikipedia is its openness and freedom—if you find a problem, you can either fix it (with your own verifiable sources) or flag it on the message boards. Culturally, there has been a shift from valuing a few reliable sources to valuing a multiplicity of competing sources. However, weighing these sources against one another has become easier than ever before.
Security of Information on the Internet
As the Internet has grown in scope and the amount of personal information online has proliferated, securing this information has become a major issue. The Internet now houses everything from online banking systems to highly personal e-mail messages, and even though security is constantly improving, this information is not invulnerable.
An example of this vulnerability is the Climategate scandal in late 2009. A collection of private e-mail messages were hacked from a server at the University of East Anglia, where much of the Intergovernmental Panel on Climate Change research takes place. These e-mails show internal debates among the scientists regarding which pieces of data should be released and which are not relevant (or helpful) to their case (Revkin, 2009). In these e-mails, the scientists sometimes talk about colleagues—especially those skeptical of climate change—in a derisive way. Of course, these e-mails were never meant to become public.
This scandal demonstrates how easy it can be to lose control of private information on the Internet. In previous decades, hard copies of these letters would have to be found, and the theft could probably be traced back to a specific culprit. With the Internet, it is much more difficult to tell who is doing the snooping, especially if it is done on a public network. The same protocols that allow for open access and communication also allow for possible exploitation. Like the Interstate Highway System, the Internet is impartial to its users. In other words: If you’re going to ride, lock your doors.
Hacking E-mail: From LOVE-LETTER-FOR-YOU to Google in China
Another explosive scandal involving e-mail account hacking occurred in late 2009, when Google’s Gmail service was attacked by hackers using IP addresses originating in China. Gmail was one of the primary services used by human rights activists because of its location in the United States and its extra encryption. To understand the magnitude of this, it is important to understand the history of e-mail hacking and the importance of physical server location and local laws.
In 2000, a student in the Philippines unleashed a computer virus that arrived as a message with the subject line “I Love You.” The e-mail had a file attached, called LOVE-LETTER-FOR-YOU.TXT.vbs. The suffix “.txt” is generally used for text files and was meant, in this case, as a distraction; the file’s real suffix was “.vbs,” which marks the file as a script. When opened, the script e-mailed itself to everyone in the user’s address book and then sent any available passwords to an e-mail address in the Philippines. One of the key aspects of this case, however, was a matter of simple jurisdiction: The student was not prosecuted, due to the lack of computer crime laws in the Philippines (Zetter, 2009).
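The double-extension trick described above can be demonstrated directly: operating systems treat only the final suffix as the file type, so a name ending in “.TXT.vbs” is a script, no matter how text-like it looks.

```python
# Only the final suffix determines how the system treats a file,
# so "LOVE-LETTER-FOR-YOU.TXT.vbs" is a Visual Basic script,
# not a text file, despite the ".TXT" in the middle of the name.
from pathlib import Path

attachment = "LOVE-LETTER-FOR-YOU.TXT.vbs"
real_suffix = Path(attachment).suffix       # ".vbs" (what actually runs)
all_suffixes = Path(attachment).suffixes    # [".TXT", ".vbs"]

print(real_suffix)    # .vbs
print(all_suffixes)   # ['.TXT', '.vbs']
```

The trick worked partly because some mail clients and Windows configurations hid the final extension, leaving only the reassuring “.TXT” visible to the victim.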
The encryption that Gmail uses resulted in only two of the accounts being successfully hacked, and hackers were only able to see e-mail subject lines and timestamps—no message content was available (Zetter, 2010). Since the chaos that ensued after the “I Love You” virus, e-mail users and service providers have become more vigilant in their defensive techniques. However, the increased reliance on e-mail for daily communication makes it an attractive target for hackers. The development of cloud computing will likely lead to entirely new problems with Internet security; just as a highway brings two communities together, it can also cause these communities to share problems.
Can’t Wait: Denial of Service
Although many people increasingly rely on the Internet for communication and access to information, this reliance has come with a hefty price. Most critically, a simple exploit can cause massive roadblocks to Internet traffic, leading to disruptions in commerce, communication, and, as the military continues to rely on the Internet, national security.
Distributed denial-of-service (DDoS) attacks work like cloud computing, but in reverse. Instead of a single computer going out to retrieve data from many different sources, DDoS is a coordinated effort by many different computers to bring down (or overwhelm) a specific website. Essentially, any web server can only handle a certain amount of information at once. While the largest and most stable web servers can talk to a huge number of computers simultaneously, even these can be overwhelmed.
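The capacity limit described above can be illustrated with a toy model (the numbers and function name are invented for this sketch, and it models nothing about real attack traffic): a server that can answer only a fixed number of requests per second must turn away the excess, so a coordinated flood crowds out legitimate users.

```python
# A toy model of a server's capacity limit during a DDoS attack:
# once total arrivals exceed capacity, each request (legitimate or
# not) has an equal chance of being served, so legitimate users
# are mostly crowded out by the flood.

def served_fraction(legit: int, attack: int, capacity: int) -> float:
    """Fraction of legitimate requests served in one second."""
    total = legit + attack
    if total <= capacity:
        return 1.0              # under capacity: everyone is served
    return capacity / total     # over capacity: random crowding-out

print(served_fraction(100, 0, 1000))      # normal load: 1.0 (all served)
print(served_fraction(100, 99900, 1000))  # under attack: 0.01 (1% get through)
```

The model makes the key point plain: the attackers do not need to break into the server at all; they only need to generate more requests than it can answer.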
During a DDoS attack on government servers belonging to both the United States and South Korea in July 2009, many U.S. government sites were rendered unavailable to users in Asia for a short time (Gorman & Ramstad, 2009). Although this did not have a major effect on U.S. cyber-security, the ease with which these servers could be exploited was troubling. In this case, the DDoS attacks were perpetrated by an e-mail virus known as MyDoom, which essentially turned users’ computers into server-attacking “zombies.” This exploit—coupling an e-mail scam with a larger attack—is difficult to trace, partly because the culprit is likely not one of the original attackers, but rather the victim of a virus used to turn vulnerable computers into an automated hacker army. Since the attack, President Barack Obama has committed to creating a new post for a head of cyber-security in the government.
Net Neutrality
Most Internet users in the United States connect through a commercial Internet service provider (ISP). The major players—Comcast, Verizon, Time Warner Cable, AT&T, and others—are portals to the larger Internet, serving as a way for anyone with a cable line or phone line to receive broadband Internet access through a dedicated data line.
Ideally, ISPs treat all content impartially; any two websites will load at the same speed if they have adequate server capabilities. Service providers are not entirely happy with this arrangement. ISPs have proposed a new service model that would allow corporations to pay for a “higher tier” service. For example, this would allow AOL Time Warner to deliver its Hulu service (which Time Warner co-owns with NBC) faster than all other video services, leading to partnerships between Internet content providers and Internet service providers. The service providers also often foot the bill for expanding high-speed Internet access, and they see this new two-tiered service as a way to cash in on some of that investment (and, presumably, to reinvest the funds received).
The main fear—and the reason the FCC introduced net neutrality rules—is that such a service would hamper the ability of an Internet startup to grow its business. Defenders of net neutrality contend that small businesses (those without the ability to forge partnerships with the service providers) would be forced onto a “second-tier” Internet service, and their content would naturally suffer, decreasing inventiveness and competition among Internet content providers.
Net Neutrality Legislation: The FCC and AT&T
One of the key roadblocks to Internet legislation is the difficulty of describing the Internet and the Internet’s place among communication bills of the past. First of all, it is important to realize that legislation relating to the impartiality of service providers is not unheard-of. Before the 1960s, AT&T was allowed to restrict its customers to using only its own telephones on its networks. In the 1960s, the FCC launched a series of “Computer Inquiries,” stating, in effect, that any customer could use any device on the network, as long as it did not actually harm the network. This led to inventions such as the fax machine, which would not have been possible under AT&T’s previous agreement.
A key point today is that these proto–net neutrality rules protected innovation even when they “threatened to be a substitute for regulated services (Cannon, 2003).” This is directly relevant to a controversy involving Apple’s iPhone that culminated in October 2009 when AT&T agreed to allow VoIP (voice over Internet protocol) on its 3G data networks. VoIP services, like the program Skype, allow a user to place a telephone call from an Internet data line to a traditional telephone line. In the case of the iPhone, AT&T did not actually block the transmission of data—it just had Apple block the app from its App Store. Since AT&T runs the phone service as well as the data lines, and since many users have plans with unlimited data connections, AT&T could see its phone profits cut drastically if all its users suddenly switched to using Skype to place all their telephone calls.
Misleading Metaphors: It’s Not a Big Truck
Senator Ted Stevens, the former head of the committee in charge of regulating the Internet, said on the floor of the Senate that the Internet is “not a big truck…it’s a series of tubes (Curtis, 2006).” According to this metaphor, an e-mail can get “stuck in the tubes” for days behind someone else’s material, leading to poorer service for the customer. In reality, service providers sell plans that cap the rate at which a customer can send data over the Internet (measured in bits per second, where a bit is the smallest unit of data). If a service is rated at 1.5 million bits per second (megabits per second, or 1.5 Mbps), it may only reach this rate once in a while; no one can “clog the tubes” without paying massive amounts of money for the service. Theoretically, the company will then invest this service fee in building more robust “tubes.”
Net neutrality is difficult to legislate in part because it can be confusing: It relies on understanding how the Internet works and how communications are regulated. Stevens’s metaphor is misleading because it assumes that Internet capacity is not already regulated in some natural way. To use the superhighway analogy, Stevens is suggesting that the highways are congested, and his solution is to allow companies to dedicate express lanes for high-paying customers (it should be noted that the revenue would go to the service providers, even though the government has chipped in quite a bit for information superhighway construction). The danger of this is that it would be very difficult for a small business or personal site to afford express-lane access. Worse yet, the pro–net neutrality organization Save the Internet says that a lack of legislation would allow companies to “discriminate in favor of their own search engines” and “leave the rest of us on a winding dirt road (Save the Internet, 2010).” For areas that only have access to one Internet service, this would amount to a lack of access to all the available content.
Digital Technology and Electronic Media
Content on the Internet competes with content from other media outlets. Unlimited and cheap digital duplication of content removes the concept of scarcity from the economic model of media; it is no longer necessary to buy a physical CD fabricated by a company in order to play music, and digital words on a screen convey the news just as well as words printed on physical newspaper. Media companies have been forced to reinvent themselves as listeners, readers, and watchers have divided into smaller and smaller subcategories.
Traditional media companies have had to evolve to adapt to the changes wrought by the Internet revolution, but these media are far from obsolete in an online world. For example, social media can provide a very inexpensive and reliable model for maintaining a band’s following. A record company (or the band itself) can start a Facebook page, through which it can notify all its fans about new albums and tour dates—or even just remind fans that it still exists. MySpace has been (and still is, to an extent) one of the main musical outlets on the Internet. This free service comes with a small web-based music player that allows people interested in the band to listen to samples of its music. Coupling free samples with social networking allows anyone to discover a band from anywhere in the world, leading to the possibility of varying and eclectic tastes not bound by geography.
Key Takeaways
- On one hand, the information superhighway has opened up rural areas to global connections and made communication and trade much easier. One downside, however, is that illicit and unwanted information can move just as quickly as positive information—it is up to the recipient to decide.
- The lack of authorial attribution on many online forums can make it difficult to find credible information. However, Wikipedia’s concept of “verifiability,” or citing verified sources, has provided a good check to what can become a matter of mere my-word-against-yours. It is important to gauge possible bias in a source and to judge whether the author has an economic interest in the information.
- Net neutrality is a general category of laws that seek to make it illegal for service providers to discriminate among types of Internet content. One downside of content discrimination is that a service provider could potentially make competitors’ sites load much more slowly than their own.
Exercises
- Find a website about a single product, musician, or author. Does the site have a stated or strongly implied author?
- Look for a copyright notice and a date, usually at the bottom of the page. How might that author’s point of view bias the information on the site?
- How can one determine the author’s credibility?
End-of-Chapter Assessment
Review Questions
Section 1
- What are two of the original characteristics of the Internet, and how do they continue to affect it?
- What were some of the technological developments that had a part in the “democratization” of the Internet, or the spread of the Internet to more people?
- What were the causes and effects of the dot-com boom and crash? How did the dot-com boom and crash influence the Internet in later years, particularly with regard to content providers’ income streams?
Section 2
- What are some of the differences between social networking sites, and how do they reflect a tendency to cater to a specific demographic?
- How might blogs help the flow of information around the world? How might they damage that information?
- How has privacy been treated on social networking sites, and how does this affect the culture?
- How have marketers tried to use social networking to their advantage?
Section 3
- How has globalization on the Internet changed the way culture is distributed?
- What are the implications of the Internet overtaking print media as a primary source for news, and how might that affect the public discourse?
- What is the “Internet paradox,” and how have various websites and services tried to combat it? How do Internet users socialize on the Internet?
Section 4
- How does the metaphor of an “information superhighway” relate to both the positive and negative aspects of the Internet?
- What are some threats to credibility online, and how can users proactively seek only credible sources?
- What is net neutrality, and how could it change the way we access information on the Internet?
- How has the Internet affected the music business? How has the Internet affected music from an artistic perspective?
Critical Thinking Questions
- One of the repeated promises of the Internet is that it is truly democratic and that anyone can have a voice. Has this played out in a viable way, or was that a naive assumption that never really came to fruition?
- How do the concepts of decentralization and protocol play a part in the way the Internet works?
- How have social networks transformed marketing? What are some of the new ways that marketers can target specific people?
- How has the Internet changed the way people socialize online? Are there entirely new forms of socializing that did not exist before the Internet?
- How has the concept of verifiability changed the way that “truth” is regarded on the Internet—even in the culture at large? Has the speed and volume with which new information becomes available on the Internet made verifiable information more difficult to come by?
Career Connection
There is a constantly growing market for people who know how to use social media effectively. Often, companies will hire someone specifically to manage their Facebook and Twitter feeds as another aspect of public relations and traditional marketing.
Read the article “5 True Things Social Media Experts Do Online,” written by social media writer Glen Allsopp. You can find it at http://www.techipedia.com/2010/social-media-expert-skills/.
Then, explore the site of Jonathan Fields, located at http://www.jonathanfields.com/blog/. After exploring for a bit, read the “About” section (the link is at the top). These two sites will help you answer the following questions:
- How has Jonathan Fields made “everything else irrelevant”? What are some of the indications that he gives in his biography that he is passionate about his industry doing well?
- Review Jonathan’s Twitter feed on the right column of his site. Who are some of the other people he features, and how might this relate to Glen Allsopp’s advice to “highlight others”?
- Also look at Jonathan’s “Small Business Marketing” section. What are some of the things he does to help businesses reach customers? How might this be potentially rewarding?
- Think about the ways that you may use social media in your own life, and how you might be able to use those skills to help a business. Pick an activity that you might (or do) participate in online and write down how you might do the same thing from the perspective of a company. For example, how would you write the “About Me” section of a company’s Facebook profile? How could you start turning this skill into a career?
References
Basham, Patrick. “Live Earth’s Inconvenient Truths,” Cato Institute, July 11, 2007, http://www.cato.org/pub_display.php?pub_id=8497.
Cannon, Robert. “The Legacy of the Federal Communications Commission’s Computer Inquiries,” Federal Communication Law Journal 55, no. 2 (2003): 170.
Curtis, Alex. “Senator Stevens Speaks on Net Neutrality,” Public Knowledge, June 28, 2006, http://www.publicknowledge.org/node/497.
Exxpose Exxon, “Global Warming Deniers and ExxonMobil,” 2006, http://www.exxposeexxon.com/facts/gwdeniers.html.
Gorman, Siobhan and Evan Ramstad, “Cyber Blitz Hits U.S., Korea,” Wall Street Journal, July 9, 2009, http://online.wsj.com/article/SB124701806176209691.html.
Greenpeace, ExxonMobil 2006 Contributions and Community Investments, October 5, 2007, http://research.greenpeaceusa.org/?a=view&d=4381.
Jesdanun, Anick. “High Expectations for the Internet,” December 30, 2002, http://www.crn.com/it-channel/18822182;jsessionid=3Z2ILJNFKM1FZQE1GHPCKH4ATMY32JVN.
Kirk, Elizabeth E. “Evaluating Information Found on the Internet,” Sheridan Libraries, Johns Hopkins University, 1996, http://www.library.jhu.edu/researchhelp/general/evaluating/.
Revkin, Andrew C. “Hacked E-Mail Is New Fodder for Climate Dispute,” New York Times, November 20, 2009, http://www.nytimes.com/2009/11/21/science/earth/21climate.html.
Save the Internet, “FAQs,” 2010, http://www.savetheinternet.com/faq.
The Biz Media, “Video and the Information Superhighway: An Artist’s Perspective,” The Biz Media, May 3, 2010, http://blog.thebizmedia.com/video-and-the-information-superhighway/.
U.S. Department of Education, Toward a New Golden Age in American Education: How the Internet, the Law and Today’s Students Are Revolutionizing Expectations, National Education Technology Plan, 2004, http://www2.ed.gov/about/offices/list/os/technology/plan/2004/site/theplan/edlite-intro.html.
Wikipedia, s.v. “Wikipedia:Neutral point of view,” http://en.wikipedia.org/wiki/Wikipedia:Neutral_point_of_view.
Wikipedia, s.v. “Wikipedia:Verifiability,” http://en.wikipedia.org/wiki/Wikipedia:Verifiability.
Zetter, Kim. “Google to Stop Censoring Search Results in China After Hack Attack,” Wired, January 12, 2010, http://www.wired.com/threatlevel/2010/01/google-censorship-china/.
Zetter, Kim. “Nov. 10, 1983: Computer ‘Virus’ Is Born,” Wired, November 10, 2009, http://www.wired.com/thisdayintech/2009/11/1110fred-cohen-first-computer-virus/.
Chapter 12: Advertising and Public Relations
12.1 Advertising
12.2 Public Relations
12.1 Advertising
Learning Objectives
- Describe important eras in the history of American advertising.
- Analyze the overall effects of government regulation on advertising.
- Identify the types of advertising used today.
- Describe the impact of advertising on American consumerism and cultural values.
Advertising is defined as promoting a product or service through the use of paid announcements (Dictionary). These announcements have had an enormous effect on modern culture, and thus deserve a great deal of attention in any treatment of the media’s influence on culture.
History of Advertising
Advertising dates back to ancient Rome’s public markets and forums and continues into the modern era in most homes around the world. Contemporary consumers relate to and identify with brands and products. Advertising has inspired an independent press and conspired to encourage carcinogenic addictions. An exceedingly human invention, advertising is an unavoidable aspect of the shared modern experience.
Ancient and Medieval Advertising
In 79 CE, the eruption of Italy’s Mount Vesuvius destroyed and, ultimately, preserved the ancient city of Pompeii. Historians have used the city’s archaeological evidence to piece together many aspects of ancient life. Pompeii’s ruins reveal a world in which the fundamental tenets of commerce and advertising were already in place. Merchants offered different brands of fish sauces identified by various names such as “Scaurus’ tunny jelly.” Wines were branded as well, and their manufacturers sought to position them by making claims about their prestige and quality. Toys and other merchandise found in the city bear the names of famous athletes, providing, perhaps, the first example of endorsement techniques (Hood, 2005).
The invention of the printing press in 1440 made it possible to print advertisements that could be put up on walls and handed out to individuals. By the 1600s, newspapers had begun to include advertisements on their pages. Advertising revenue allowed newspapers to print independently of secular or clerical authority, eventually achieving daily circulation. By the end of the 17th century, most newspapers contained at least some advertisements (O’Barr, 2005).
Selling the New World
European colonization of the Americas during the 1600s brought about one of the first large-scale advertising campaigns. When European trading companies realized that the Americas held economic potential as a source of natural resources such as timber, fur, and tobacco, they attempted to convince others to cross the Atlantic Ocean and work to harvest this bounty. The advertisements for this venture described a paradise without beggars and with plenty of land for those who made the trip. The advertisements convinced many poor Europeans to become indentured servants to pay for the voyage (Mierau, 2000).
Nineteenth-Century Roots of Modern Advertising
The rise of the penny press during the 1800s had a profound effect on advertising. The New York Sun embraced a novel advertising model in 1833 that allowed it to sell issues of the paper for a trifling amount of money, ensuring a higher circulation and a wider audience. This larger audience in turn justified greater prices for advertisements, allowing the paper to make a profit from its ads rather than from direct sales (Vance).
P. T. Barnum and Advertising
The career of P. T. Barnum, cofounder of the famed Barnum & Bailey circus, gives a sense of the uncontrolled nature of advertising during the 1800s. He began his career in the 1840s writing ads for a theater, and soon after, he began promoting his own shows. He advertised these shows any way he could, using not only interesting newspaper ads but also bands of musicians, paintings on the outside of his buildings, and street-spanning banners.
Barnum also learned the effectiveness of using the media to gain attention. In an early publicity stunt, Barnum hired a man to wordlessly stack bricks at various corners near his museum during the hours preceding a show. When this activity drew a crowd, the man went to the museum and bought a ticket for the show. Over the next two days, the stunt drew such large crowds that the police made Barnum put a halt to it, which gained him even wider media attention. Barnum was later sued for fraud over a bearded woman featured in one of his shows; the plaintiffs claimed that she was, in fact, a man. Rather than trying to keep the trial quiet, Barnum drew attention to it by parading out a crowd of witnesses attesting to the bearded woman's sex, drawing more media attention—and more customers.
Barnum aimed to make his audience think about what they had seen for an extended time. His Feejee mermaid—actually a mummified monkey and fish sewn together—was not necessarily interesting because viewers thought the creation was really a mermaid, but because they weren’t sure if it was or not. Such marketing tactics brought Barnum’s shows out of his establishments and into social conversations and newspapers (Applegate, 1998). Although most companies today would eschew Barnum’s outrageous style, many have used the media and a similar sense of mystery to promote their products. Apple, for example, famously keeps its products such as the iPhone and iPad under wraps, building media anticipation and coverage.
In 1843, a salesman named Volney Palmer founded the first U.S. advertising agency in Philadelphia. The agency made money by linking potential advertisers with newspapers. By 1867, other agencies had formed, and advertisements were being marketed at the national level. During this time, George Rowell, who made a living buying bulk advertising space in newspapers to subdivide and sell to advertisers, began conducting market research in its modern, recognizable form. He used surveys and circulation counts to estimate the number of readers and anticipate effective advertising techniques. His agency gained an advantage over other agencies by offering the advertising space best suited to a particular product. This trend quickly caught on with other agencies. In 1888, Rowell started the first advertising trade magazine, Printers' Ink (Gartrell).
In Chapter 5 “Magazines”, you read about McClure’s success in 1893 thanks to an advertising model: selling issues for nearly half the price of other magazines and depending on advertising revenues to make up the difference between cost and sales price. Magazines such as Ladies’ Home Journal focused on specific audiences, so they allowed advertisers to market products designed for a specific demographic. By 1900, Harper’s Weekly, once known for refusing advertising, featured ads on half of its pages (All Classic Ads).
The Rise of Brand Names
Another ubiquitous aspect of advertising developed around this time: brands. During most of the 19th century, consumers purchased goods in bulk, weighing out scoops of flour or sugar from large store barrels and paying for them by the pound. Innovations in industrial packaging allowed companies to mass produce bags, tins, and cartons with brand names on them. Although brands existed before this time, they were generally reserved for goods that were inherently recognizable, such as china or furniture. Advertising a particular kind of honey or flour made it possible for customers to ask for that product by name, giving it an edge over the unnamed competition.1
The rise of department stores during the late 1800s also gave brands a push. Nationwide outlets such as Sears, Roebuck & Company and Montgomery Ward sold many of the same items to consumers all over the country. A particular item spotted in a big-city storefront could come to a small-town shopper’s home thanks to mail-order catalogs. Customers made associations with the stores, trusting them to have a particular kind of item and to provide quality wares. Essentially, consumers came to trust the store’s name rather than its specific products.2
Advertising Gains Stature During the 20th Century
Although advertising was becoming increasingly accepted as an element of mass media, many still regarded it as an unseemly occupation. This attitude began to change during the early 20th century. As magazines—widely considered a highbrow medium—began using more advertising, the advertising profession began attracting more artists and writers. Writers used verse and artists produced illustrations to embellish advertisements. Not surprisingly, this era gave rise to commercial jingles and iconic brand characters such as the Jolly Green Giant and the Pillsbury Doughboy.
The household cleaner Sapolio produced advertisements that made the most of the artistic advertising trend. Sapolio’s ads featured various drawings of the residents of “Spotless Town” along with a rhymed verse celebrating the virtues of this fictional haven of cleanliness. The public anticipated each new ad in much the same way people today anticipate new TV episodes. In fact, the ads became so popular that citizens passed “Spotless Town” resolutions to clean up their own jurisdictions. Advertising trends later moved away from flowery writing and artistry, but the lessons of those memorable campaigns continued to influence the advertising profession for years to come (Fox, 1984).
Advertising Makes Itself Useful
World War I fueled an advertising and propaganda boom. Corporations that had switched to manufacturing wartime goods wanted to stay in the public eye by advertising their patriotism. Equally, the government needed to encourage public support for the war, employing such techniques as the famous Uncle Sam recruiting poster. President Woodrow Wilson established the advertiser-run Committee on Public Information to make movies and posters, write speeches, and generally sell the war to the public. Advertising helped popularize World War I on the home front, and the war in turn gave advertising a much-needed boost in stature. The postwar return to regular manufacturing initiated the 1920s as an era of unprecedented advertising.3
New Media
The rising film industry made celebrity testimonials, or product endorsements, an important aspect of advertising during the 1920s. Film stars including Clara Bow and Joan Crawford endorsed products such as Lux toilet soap. In these early days of mass-media consumer culture, film actors and actresses gave the public figures to emulate as they began participating in popular culture.4
As discussed in Chapter 7 “Radio”, radio became an accepted commercial medium during the 1920s. Although many initially thought radio too intrusive a medium for advertising because it entered people’s homes, by the end of the decade advertising had become an integral aspect of programming. Advertising agencies often created their own programs that networks then distributed. As advertisers conducted surveys and researched prime time slots, radio programming changed to appeal to their target demographics. The famous Lux Radio Theater, for example, was named for and sponsored by a brand of soap. Product placement was an important part of these early radio programs. Ads for Jell-O appeared during the course of the Jack Benny Show (JackBennyShow.com), and Fibber McGee and Molly scripts often involved their sponsor’s floor wax (Burgan, 1996). The relationship between a sponsor and a show’s producers was not always harmonious; producers were constrained from broadcasting any content that might reflect badly on their sponsor.
The Great Depression and Backlash
Unsurprisingly, the Great Depression, with its widespread decreases in levels of income and buying power, had a negative effect on advertising. Spending on ads dropped to a mere 38 percent of its previous level. Social reformers added to revenue woes by again questioning the moral standing of the advertising profession. Books such as Through Many Windows and Our Master’s Voice portrayed advertisers as dishonest and cynical, willing to say anything to make a profit and unconcerned about their influence on society. Humorists also questioned advertising’s authority. The Depression-era magazine Ballyhoo regularly featured parodies of ads, similar to those seen later on Saturday Night Live or in The Onion. These ads mocked the claims that had been made throughout the 1920s, further reducing advertising’s public standing.5
This advertising downturn lasted only as long as the Depression. As the United States entered World War II, advertising again returned to encourage public support and improve the image of businesses.6 However, there was one lasting effect of the Depression. The rising consumer movement made false and misleading advertising a major public policy issue. At the time, companies such as Fleischmann’s (which claimed its yeast could cure crooked teeth) were using advertisements to pitch misleading assertions. Only business owners’ personal morals stood in the way of such claims until 1938, when the federal government gave the Federal Trade Commission (FTC) the authority to halt false advertising.
In 1955, TV outpaced all other media for advertising. TV provided advertisers with unique, geographically oriented mass markets that could be targeted with regionally appropriate ads (Samuel, 2006). The 1950s saw a 75 percent increase in advertising spending, faster than any other economic indicator at the time.7
Single sponsors created early TV programs. These sponsors had total control over programs such as Goodyear TV Playhouse and Kraft Television Theatre. Some sponsors went as far as to manipulate various aspects of the programs. In one instance, a program sponsored by the DeSoto car company asked a contestant to use a false name rather than his real name, Ford. The present-day network model of TV advertising took hold during the 1950s as the costs of TV production made sole sponsorship of a show prohibitive for most companies. Rather than relying on a single sponsor, the networks began producing their own shows, paying for them through ads sold to a number of different sponsors.8 Under the new model, TV producers had much more creative control than they had under the sole-sponsorship model.
The quiz shows of the 1950s were the last of the single-sponsor–produced programs. In 1958, when allegations of quiz show fraud became national news, advertisers moved out of programming entirely. The quiz show scandals also added to an increasing skepticism of ads and consumer culture (Boddy, 1990).
Advertising research during the 1950s had used scientifically driven techniques to attempt to influence consumer opinion. Although the effectiveness of this type of advertising is questionable, the idea of consumer manipulation through scientific methods became an issue for many Americans. Vance Packard’s best-selling 1957 book The Hidden Persuaders targeted this style of advertising. The Hidden Persuaders and other books like it were part of a growing critique of 1950s consumer culture. The U.S. public was becoming increasingly wary of advertising claims—not to mention increasingly weary of ads themselves. A few adventurous ad agencies used this consumer fatigue to usher in a new era of advertising and American culture (Frank, 1998).
The Creative Revolution
Burdened by association with Nazi Germany, where the company had originated, Volkswagen took a daring risk during the 1950s. In 1959, the Doyle Dane Bernbach (DDB) agency initiated an ad campaign for the company that targeted skeptics of contemporary culture. Using a frank personal tone with the audience and making fun of the planned obsolescence that was the hallmark of Detroit automakers, the campaign stood apart from other advertisements of the time. It used many of the consumer icons of the 1950s, such as suburbia and game shows, in a satirical way, pitting Volkswagen against mainstream conformity and placing it strongly on the side of the consumer. By the end of the 1960s, the campaign had become an icon of American anticonformity. In fact, it was such a success that other automakers quickly emulated it. Ads for the Dodge Fever, for example, mocked corporate values and championed rebellion.9
This era of advertising became known as the creative revolution for its emphasis on creativity over straight salesmanship. The creative revolution reflected the values of the growing anticonformist movement that culminated in the countercultural revolution of the 1960s. The creativity and anticonformity of 1960s advertising quickly gave way to more product-oriented conventional ads during the 1970s. Agency conglomeration, a recession, and cultural fallout were all factors in the recycling of older ad techniques. Major TV networks dropped their long-standing ban on comparative advertising early in the decade, leading to a new trend in positioning ads that compared products. Advertising wars such as Coke versus Pepsi and, later, Microsoft versus Apple were products of this trend.10
Innovations in the 1980s stemmed from a new TV channel: MTV. Producers of youth-oriented products created ads featuring music and focusing on stylistic effects, mirroring the look and feel of music videos. By the end of the decade, this style had extended to more mainstream products. Campaigns for the pain reliever Nuprin featured black-and-white footage with bright yellow pills, whereas ads for Michelob used grainy atmospheric effects (New York Times, 1989).
Advertising Stumbles
During the late 1980s, studies showed that consumers were trending away from brands and brand loyalty. A recession coupled with general consumer fatigue led to an increase in generic brand purchases and a decrease in advertising. In 1983, marketing budgets allocated 70 percent of their expenditures to ads and the remaining 30 percent to other forms of promotion. By 1993, only 25 percent of marketing budgets were dedicated to ads (Klein, 2002).
These developments resulted in the rise of big-box stores such as Wal-Mart that focused on low prices rather than expensive name brands. Large brands remade themselves during this period to focus less on their products and more on the ideas behind the brand. Nike’s “Just Do It” campaign, endorsed by basketball star Michael Jordan, gave the company a new direction and a new means of promotion. Nike representatives have stated they have become more of a “marketing-oriented company” as opposed to a product manufacturer.11
As large brands became more popular, they also attracted the attention of reformers. Companies such as Starbucks and Nike bore the brunt of late 1990s sweatshop and labor protests. As these brands attempted to incorporate ideas outside of the scope of their products, they also came to represent larger global commerce forces (Hornblower, 2000). This type of branding increasingly incorporated public relations techniques that will be discussed later in this chapter.
The Rise of Digital Media
Twenty-first-century advertising has adapted to new forms of digital media. Internet outlets such as blogs, social media forums, and other online spaces have created new possibilities for advertisers, and shifts in broadcasting toward Internet formats have threatened older forms of advertising. Video games, smartphones, and other technologies also present new possibilities. Specific new media advertising techniques will be covered in the next section.
Types of Advertising
Despite the rise of digital media, many types of traditional advertising have proven their enduring effectiveness. Local advertisers and large corporations continue to rely on billboards and direct-mail fliers. In 2009, Google initiated a billboard campaign for its Google Apps products that targeted business commuters. The billboards featured a different message every day for an entire month, using simple computer text messages portraying a fictitious executive learning about the product. Although this campaign was integrated with social media sites such as Twitter, its main thrust employed the basic billboard (Ionescu, 2009).
Newspapers and Magazines
Although print ads have been around for centuries, Internet growth has hit newspaper advertising hard. A 45 percent drop in ad revenue between 2007 and 2010 signaled a catastrophic decline for the newspaper industry (Sterling, 2010). Traditionally, newspapers have made money through commercial and classified advertising. Commercial advertisers, however, have moved to electronic media forms, and classified ad websites such as Craigslist offer greater geographic coverage for free. The future of newspaper advertising—and of the newspaper industry as a whole—is up in the air.
Print magazines have suffered from many of the same difficulties as newspapers. Declining advertising revenue has contributed to the end of popular magazines such as Gourmet and to the introduction of new magazines that cross over into other media formats, such as Food Network Magazine. Until a new, effective model is developed, the future of magazine advertising will continue to be in doubt.
Radio
Compared to newspapers and magazines, radio’s advertising revenue has done well. Radio’s easy adaptation to new forms of communication has made it an easy sell to advertisers. Unlike newspaper ads, radio ads can target specific consumers. Advertisers can also pay to have radio personalities read their ads live in the studio, adding a sense of personal endorsement to the business or product. Because newer forms of radio such as satellite and Internet stations have continued to use this model, the industry has not had as much trouble adapting as print media have.
Television
TV advertising relies on verbal as well as visual cues to sell items. Advertisers purchase promotional ad time, and a spot usually runs 15 to 30 seconds. Longer ads, known as infomercials, run like a TV show and usually aim for direct viewer response. New technologies such as DVRs allow TV watchers to skip through commercials; however, studies have shown that these technologies do not have a negative effect on advertising (Gallagher, 2010). This is partly due to product placement, an important aspect of TV advertising that incorporates products into the plots of shows themselves. Although product placement has been around since the 1890s, when the Lumière brothers first placed Lever soap in their movies, the big boom in product placement began with the reality TV show Survivor in 2000 (Anderson, 2006). Since then, product placement has been a staple of prime-time entertainment. Reality TV shows such as Project Runway and American Idol are known for exhibiting products on screen, and talk-show host Oprah Winfrey made news in 2004 when she gave away new Pontiacs to her audience members (Stansky, 2008). Even children’s shows are known to hawk products; a cartoon series recently began on Nickelodeon featuring characters that represent different Skechers sneakers (Freidman, 2010).
Digital Media
Emerging digital media platforms such as the Internet and mobile phones have created many new advertising possibilities. The Internet, like TV and radio, offers free services in exchange for advertising exposure. However, unlike radio or TV, the Internet is a highly personalized medium in which users share private information that advertisers can use to target them.
Viral Ads
As you read in Chapter 11 “The Internet and Social Media”, new advertising techniques have become popular on the Internet. Advertisers have tried to capitalize on the shared-media phenomenon by creating viral ads that achieve spontaneous success online. Fewer than one in six ads intended to go viral actually succeed, so marketers have developed strategies to encourage an advertisement’s viral potential. Successful spots focus on creativity rather than hard selling and generally target a specific audience (Fox Business, 2010). Recent Old Spice ads featured former NFL player Isaiah Mustafa in a single continuous shot that moved from a shower room to a yacht. The commercial ends with the actor on horseback, a theatrical trick that left viewers wondering how the stunt was pulled off. As of July 2010, the ad was the most popular video on YouTube, with more than 94 million views, and Old Spice sales had risen 106 percent (Neff, 2010).
Social Media
Social media sites such as Facebook use the information users provide on their profiles to generate targeted advertisements. For instance, if a person is a fan of Mariah Carey or has joined a group associated with the singer, he or she might see announcements advertising her new CD or a local concert. While this may seem harmless, clicking on an ad sends user data to the advertising company, including name and user ID. Many people have raised privacy concerns over this practice, yet it remains in use. Free e-mail services such as Gmail also depend on targeted advertising for their survival; indeed, advertising revenue is what allows such services to remain free. Given the ongoing privacy debates concerning targeted Internet advertising, a balance between users’ privacy and the accessibility of services will have to be struck in the near future.
Mobile Phones
Mobile phones provide several different avenues for advertisers. The growing use of Internet radio on mobile-phone platforms has created a market for advertisements tapped by radio advertising networks such as TargetSpot. By applying the radio advertising model to mobile phones, these networks give users more broadcast options and give advertisers access to new targeted markets (Marketwire, 2010).
Another development in the mobile-phone market is the use of advertising in smartphone apps. Free versions of mobile-phone applications often include advertising to pay for the service. Popular apps such as WeatherBug and Angry Birds offer free versions with ads in the margins; however, users can avoid these ads by paying a few dollars to upgrade to “Pro” versions. Other apps such as Foursquare access a user’s geographic location and offer ads for businesses within walking distance (Fairlee, 2010).
Government Regulation of Advertising
Advertising regulation has played an important role in advertising’s history and cultural influence. One of the earliest federal laws addressing advertising was the Pure Food and Drug Act of 1906. A reaction to public outcry over the false claims of patent medicines, this law required informational labels to be placed on these products. It did not, however, address the questionable claims made in the advertisements themselves, so it did not truly delve into the issue of false advertising.12
The Formation of the FTC
Founded in 1914, the Federal Trade Commission (FTC) became responsible for regulating false advertising claims. Although federal laws concerning these practices required plaintiffs to prove that actual harm was done by an advertisement, state laws passed during the early 1920s allowed prosecution of misleading advertisements regardless of the harm done.13 The National Association of Attorneys General has helped states remain an important part of advertising regulation. In 1995, 13 states passed laws that required sweepstakes companies to provide full disclosure of the rules and details of their contests (O’Guinn et al., 2009).
During the Great Depression, New Deal legislation threatened to outlaw any misleading advertising, a result of the burgeoning consumer movement and the public outcry against advertising during the period (Time, 1941). The reformers did not fully achieve their goals, but they did make a permanent mark on advertising history. The 1938 Wheeler-Lea Amendment expanded the FTC’s role to protect consumers from deceptive advertising. Until this point, the FTC was responsible for addressing false advertising complaints from competitors. With this legislation, the agency also became an important resource for the consumer movement.
Truth in Advertising
In 1971, the FTC began the Advertising Substantiation Program to force advertisers to provide evidence for the claims in their advertisements. Under this program, the FTC gained the power to issue cease-and-desist orders to advertisers regarding specific ads and to order corrective advertising, under which the FTC can force a company to issue an advertisement acknowledging and correcting an earlier misleading ad. Regulations under this program established that supposed experts used in advertisements must be qualified experts in their field and that celebrities must actually use the products they endorse.14 In 2006, Sunny Health Nutrition was brought to court for advertising height-enhancing pills called HeightMax. The FTC found that the company had hired an actor to appear as an expert in its ads and that the pills did not live up to their claims. Sunny Health Nutrition was forced to pay $375,000 to consumers for misrepresenting its product (Consumer Affairs, 2006).
In 1992, the FTC introduced guidelines defining terms such as biodegradable and recyclable. The growth of the environmental movement in the early 1990s had led to an upsurge in environmental claims by manufacturers and advertisers. For example, Mobil Oil claimed its Hefty trash bags were biodegradable. While the statement was technically true, a 500- to 1,000-year decomposition cycle does not meet most people’s definition of the term (Lapidos, 2007). The FTC guidelines made such claims illegal (Schneider, 1992).
Regulation of the Internet
The FTC has also turned its attention to online advertising. The Children’s Online Privacy Protection Act of 1998 prohibits companies from obtaining the personal information of children who access websites or other online resources. Because of the youth orientation of the Internet, newer advertising techniques have drawn increasing criticism. Alcohol companies in particular have come under scrutiny. Beer manufacturer Heineken’s online presence includes a virtual city in which users can own an apartment and use services such as e-mail. This practice mirrors that of children’s advertising, in which companies often create virtual worlds to immerse children in their products. However, the age-verification requirements for participating in this type of environment are easily falsified and can leave young children exposed to more mature content (Gardner, 2010).
Consumer and privacy advocates who are concerned over privacy intrusions by advertisers have also called for Internet ad regulation. In 2009, the FTC acted on complaints against Sears that resulted in an injunction against the company for not providing sufficient disclosure. Sears offered $10 to consumers to download a program that tracked their Internet browsing. The FTC came down on Sears because the downloaded software tracked sensitive information that was not fully disclosed to the consumer. Similar consumer complaints against Facebook and Google for their consumer tracking have, at present, not resulted in FTC actions; however, the growing outcry makes new regulation of Internet advertising likely (Shields, 2010).
Advertising’s Influence on Culture
Discussing advertising’s influence on culture raises a long-standing debate. One opinion holds that advertising simply reflects the trends inherent in a culture; the other claims that advertising takes an active role in shaping culture. Both ideas have merit and are most likely true to varying degrees.
Advertising and the Rise of Consumer Culture
George Babbitt, the protagonist of Sinclair Lewis’s 1922 novel Babbitt, was a true believer in the growing American consumer culture:
Just as the priests of the Presbyterian Church determined his every religious belief…so did the national advertisers fix the surface of his life, fix what he believed to be his individuality. These standard advertised wares—toothpastes, socks, tires, cameras, instantaneous hot-water heaters—were his symbols and proofs of excellence; at first the signs, and then the substitutes, for joy and passion and wisdom (Lewis, 1922).
Although Babbitt is a fictional representation of a 1920s-era consumer rather than an actual person, the passage captures the national consumer culture that was taking shape at the time. As it had always done, advertising sought to attach products to larger ideas and symbols of worth and cultural values. However, the rise of mass media, and of the advertising models these media embraced, gave advertising an increasingly influential cultural role.
Automobile ads of the 1920s portrayed cars as a new, free way of life rather than simply a means of transportation. Advertisers used new ideas about personal hygiene to sell products and ended up breaking taboos about public discussion of the body. The newly acknowledged epidemics of halitosis and body odor brought about products such as mouthwash and deodorant. A Listerine campaign of the era transformed bad breath from a nuisance into the mark of a sociopath (Ashenburg, 2008). Women’s underwear and menstruation went from being topics unsuitable for most family conversations to being fodder for the pages of national magazines.15
Creating the Modern World
World War I bond campaigns had made it clear that advertising could be used to influence public beliefs and values. Advertising focused on the new, making new products and ideas seem better than older ones and ushering in a sense of modernity. In an address to the American Association of Advertising Agencies in 1926, President Coolidge went so far as to hold advertisers responsible for the “regeneration and redemption of mankind” (Marchand, 1985).
Up through the 1960s, most advertising agencies were owned and staffed by affluent white men, and advertising’s portrayals of typical American families reflected this status quo. Mainstream culture as propagated by magazine, radio, and newspaper advertising was that of middle- or upper-class White suburban families (Marchand, 1985). This sanitized image of the suburban family, popularized in such TV programs as Leave It to Beaver, has been mercilessly satirized since the cultural backlash of the 1960s.
A great deal of that era’s cultural criticism targeted the image of the advertiser as a manipulator and promulgator of superficial consumerism. When advertisers for Volkswagen picked up on this criticism, turned it to their advantage, and created a new set of consumer symbols that would come to represent an age of rebellion, they neatly co-opted the arguments against advertising for their own purposes. In many instances, advertising has functioned as a codifier of its own ideals by taking new cultural values and turning them into symbols of a new phase of consumerism. This is the goal of effective advertising.
Apple’s 1984 campaign is one of the most well-known examples of defining a product in terms of new cultural trends. A fledgling company compared with computer giants IBM and Xerox, Apple spent nearly $2 million on a commercial that would end up airing only once (McAloney, 1984). During the third quarter of the 1984 Super Bowl, viewers across the United States watched in amazement as an ad unlike any other of the time appeared on their TV screens. The commercial showed a drab gray auditorium where identical individuals sat in front of a large screen. On the screen was a man, addressing the audience with an eerily captivating voice. “We are one people, with one will,” he droned. “Our enemies shall talk themselves to death. And we will bury them with their own confusion. We shall prevail!” (McAloney, 1984). While the audience sat motionless, one woman ran forward with a sledgehammer and threw it at the screen, causing it to explode in a flash of light and smoke. As the scene faded out, a narrator announced the product: “On January 24, Apple Computer will introduce the Macintosh. And you’ll see why 1984 won’t be like 1984” (Freidman, 1984). With this commercial, Apple defined itself as a pioneer of a new generation. Instead of marketing its products as utilitarian tools, it advertised them as devices for combating conformity (Freidman, 1984). Over the next few decades, other companies imitated this approach, presenting their products as symbols of cultural values.
In his study of advertising’s cultural impact, The Conquest of Cool, Thomas Frank compares the advertising of the 1960s with that of the early 1990s:
How [advertisers] must have rejoiced when the leading minds of the culture industry announced the discovery of an all-new angry generation, the “Twenty-Somethings,” complete with a panoply of musical styles, hairdos, and verbal signifiers ready-made to rejuvenate advertising’s sagging credibility…. The strangest aspect of what followed wasn’t the immediate onslaught of even hipper advertising, but that the entire “Generation X” discourse repeated…the discussions of youth culture that had appeared in Advertising Age, Madison Avenue, and on all those youth-market panel discussions back in the sixties.16
To be clear, advertisers have not set out to consciously manipulate the public in the name of consumer culture. Rather, advertisers are simply doing their job—one that has had an enormous influence on culture.
Advertising Stereotypes
The White, middle-class composition of ad agencies contributed to advertisements’ rare depictions of minority populations. DDB—the agency responsible for the Volkswagen ads of the 1960s—was an anomaly in this regard. One of its more popular ads was for Levy’s rye bread. Most conventional advertisers would have ignored the ethnic aspects of this product and simply marketed it to a mainstream White audience. Instead, the innovative agency created an ad campaign that made ethnic diversity a selling point, with spots featuring individuals from a variety of racial backgrounds eating the bread with the headline “You don’t have to be Jewish to love Levy’s.”
During the 1950s, stereotypical images of African Americans promulgated by advertisers began to draw criticism from civil rights leaders. Icons such as Aunt Jemima, the Cream of Wheat chef, and the Hiram Walker butler were some of the most recognizable Black figures in U.S. culture. Unlike the African Americans who had gained fame through their artistry, scholarship, and athleticism, however, these advertising characters were famous for being domestic servants.
During the 1960s, meetings of the American Association of Advertising Agencies (AAAA) hosted civil rights leaders, and agencies began to respond to the criticisms of bias. A New York survey in the mid-1960s discovered that Blacks were underrepresented at advertising agencies. Many agencies responded by hiring new African American employees, and a number of Black-owned agencies started in the 1970s.17
Early advertising frequently reached out to women because they made approximately 80 percent of all consumer purchases. Thus, women were well represented in advertising. However, those depictions presented women in extremely narrow roles. Through the 1960s, ads targeting women generally showed them performing domestic duties such as cooking or cleaning, whereas ads targeting men often placed women in a submissive sexual role even if the product lacked any overt sexual connotation. A National Car Rental ad from the early 1970s featured a disheveled female employee in a chair with the headline “Go Ahead, Take Advantage of Us.” Another ad from the 1970s pictured a man with new Dacron slacks standing on top of a woman, proclaiming, “It’s nice to have a girl around the house (Frauenfelder, 2008).”
An advertising profile printed in Advertising Age magazine gave a typical advertiser’s understanding of the housewife at the time:
She likes to watch TV and she does not enjoy reading a great deal. She is most easily reached through TV and the simple down-to-earth magazines…. Mental activity is arduous for her…. She is a person who wants to have things she can believe in rather than things she can think about (Rodnitzky, 1999).
The National Organization for Women (NOW) created a campaign during the early 1970s targeting the role of women in advertisements. Participants complained about the ads to networks and companies and even spray-painted slogans on offensive billboards in protest.
Representation of minorities and women in advertising has improved since the 1960s and ’70s, but it still remains a problem. The 2010 Super Bowl drew one of the most diverse audiences ever recorded for the event, including a 45 percent female audience. Yet the commercials remained focused strictly on men. Of the 67 ads shown during the game, only four featured minority actors in a lead role. Despite the obvious economic benefit of diversity in marketing, advertising practices have resisted change (Ali, 2010).
Advertising to Children
The majority of advertisements that target children feature either toys or junk food. Children under the age of eight typically lack the ability to distinguish between fantasy and reality, and many advertisers use this to their advantage. Studies have shown that most child-focused food advertisements feature high-calorie, low-nutrition foods such as sugary cereals. Although the government regulates advertising to children to a degree, the Internet has introduced new means of marketing to youth that have not been addressed. Online video games called advergames feature famous child-oriented products. The games differ from traditional advertising, however, because the children playing them experience a much longer period of product exposure than they do from the typical 30-second TV commercial. Child advocacy groups have been pushing for increased regulation of advertising to children, but it remains to be seen whether this will take place (Calvert, 2008).
Positive Effects of Advertising
Although many people focus on advertising’s negative outcomes, the medium has provided unique benefits over time. Early newspaper advertising allowed newspapers to become independent of church and government control, encouraging the development of a free press with the ability to criticize powerful interests. When newspapers and magazines moved to an advertising model, these publications became accessible to large groups of people who previously could not afford them. Advertising also contributed to radio’s and TV’s most successful eras. Radio’s golden age in the 1940s and TV’s golden age in the 1950s both took place when advertisers were creating or heavily involved with the production of most of the programs.
Advertising also makes newer forms of media both useful and accessible. Many Internet services, such as e-mail and smartphone applications, are only free because they feature advertising. Advertising allows promoters and service providers to reduce and sometimes eliminate the upfront purchase price, making these services available to a greater number of people and allowing lower economic classes to take part in mass culture.
Advertising has also been a longtime promoter of the arts. During the Renaissance, painters and composers often relied on wealthy patrons or governments to promote their work. Corporate advertising has given artists new means to fund their creative efforts. In addition, many artists and writers have been able to support themselves by working for advertisers. The use of music in commercials, particularly in recent years, has provided musicians with notoriety and income. Indeed, it is hard to imagine the cultural landscape of the United States without advertising.
Key Takeaways
- Advertising has existed since ancient times but began to take on its modern form during the Age of Exploration. By the 19th century, newspapers and magazines had begun printing advertising to generate needed revenues.
- Although many considered advertising to be lowbrow or immoral up to the early 20th century, it saw greater acceptance after its use during World War I to encourage support for the war.
- Up to the mid-20th century, advertising featured stereotypical affluent White families and was sale driven. Criticism of advertiser manipulation became the basis of a new style of advertising during the creative revolution of the 1960s.
- The rise of the Internet has caused print advertising revenues to decline but allows for personally targeted ads. Such tracking practices have aroused concern from privacy groups.
- The Federal Trade Commission is charged with ensuring that advertisements make verifiable claims and do not overtly mislead consumers.
- Advertising has infused American culture with mass images and ideas, creating a nation of consumers and shaping how people view themselves and others.
Exercises
Please answer the following short-answer questions. Each response should be a minimum of one paragraph.
- What are the important eras in the history of American advertising?
- How does government regulation affect advertising?
- What types of advertising are in use today?
- What influence does advertising have on American consumerism and culture?
- How has advertising affected newspapers?
1Mierau, 42.
2Hood, 28–51.
3Fox, 74–77.
4Fox, 89.
5Fox, 121–124.
6Fox, 168.
7Fox, 173.
8Fox, 210–215.
9Frank, 60–67, 159.
10Fox, 324–325.
11Klein, 12–22.
12Fox, 65–66.
13Hood, 74–75.
14O’Guinn, Allen, and Semenik, 131–137.
15Fox, 95–96.
16Frank, 233–235.
17Fox, 278–284.
References
Ali, Sam. “New Study: Super Bowl Ads Created by White Men,” DiversityInc.com, May 10, 2010, http://www.diversityinc.com/article/7566/New-Study-Super-Bowl-Ads-Created-by-White-Men/.
All Classic Ads, “Advertising Timeline,” Vintage Collection, All Classic Ads, http://www.allclassicads.com/advertising-timeline.html.
Anderson, Nate. “Product placement in the DVR era,” Ars Technica (blog), March 19, 2006, http://arstechnica.com/gadgets/news/2006/03/productplacement.ars.
Applegate, Edd. Personalities and Products: A Historical Perspective on Advertising in America (Westport, CT: Greenwood Press, 1998), 57–64.
Ashenburg, Katherine. The Dirt on Clean: An Unsanitized History (Toronto: Vintage Canada, 2008), 245–247.
Boddy, William. “The Seven Dwarfs and the Money Grubbers: The Public Relations Crisis of US Television in the Late 1950s,” in Logics of Television: Essays in Cultural Criticism, ed. Patricia Mellencamp (Bloomington, IN: Indiana University Press, 1990), 110.
Burgan, Read G. “Radio Fun with Fibber McGee and Molly,” RGB Digital Audio, January 24, 1996, http://www.rgbdigitalaudio.com/OTR_Reviews/Fibber_McGee_OTRArticle.htm.
Calvert, Sandra. “Children as Consumers: Advertising and Marketing,” The Future of Children 18, no. 1 (Spring 2008): 205–211.
ConsumerAffairs.com, “Feds Slam ‘Height-Enhancing’ Pills,” November 29, 2006, http://www.consumeraffairs.com/news04/2006/11/ftc_chitosan.html.
Dictionary.com, s.v. “Advertising,” http://dictionary.reference.com/browse/advertising.
Fairlee, Rik. “Smartphone Users Go for Location-Based Apps,” PC Magazine, May 18, 2010, http://www.pcmag.com/article2/0,2817,2363899,00.asp.
Fox Business, “Old Spice and E*TRADE Ads Provide Lessons in Viral Marketing,” March 17, 2010, http://www.foxbusiness.com/story/markets/industries/finance/old-spice-etrade-ads-provide-lessons-viral-marketing/.
Fox, Stephen. The Mirror Makers (New York: William Morrow, 1984), 41–46.
Frank, Thomas. The Conquest of Cool (Chicago: University of Chicago Press, 1998), 41.
Frauenfelder, Mark. “Creepy Slacks Ad From 1970,” Boing Boing, (blog), May 12, 2008, http://boingboing.net/2008/05/12/creepy-slacks-ad-fro.html.
Friedman, Ted. “Apple’s 1984: The Introduction of the Macintosh in the Cultural History of Personal Computers,” http://www.duke.edu/~tlove/mac.htm.
Friedman, Wayne. “Product Placement in Kids’ TV Programs: Stuff Your Footwear Can Slip On,” TV Watch, September 16, 2010, http://www.mediapost.com/publications/?fa=Articles.showArticle&art_aid=135873.
Gallagher, James. “Duke Study: TiVo Doesn’t Hurt TV Advertising,” Triangle Business Journal, May 3, 2010, http://www.bizjournals.com/triangle/stories/2010/05/03/daily6.html.
Gardner, Amanda. “Alcohol Companies Use New Media to Lure Young Drinkers: Report,” Bloomberg BusinessWeek, May 19, 2010, http://www.businessweek.com/lifestyle/content/healthday/639266.html.
Gartrell, Ellen. “More About Early Advertising Publications,” Digital Collections, Duke University Libraries, http://library.duke.edu/digitalcollections/eaa/printlit.html.
Hood, John. Selling the Dream: Why Advertising Is Good Business (Westport, CT: Praeger, 2005), 12–13.
Hornblower, Margot. “Wake Up and Smell the Protest,” Time, April 17, 2000.
Ionescu, Daniel. “Google Billboard Ads Gun for Microsoft and Promote Google Apps,” PC World, August 3, 2009, http://www.pcworld.com/article/169475/google_billboard_ads_gun_for_microsoft_and_promote_google_apps.html.
JackBennyShow.com, “Jell-O,” Jack Benny Show, http://jackbennyshow.com/index_090.htm.
Klein, Naomi. No Logo (New York: Picador, 2002), 14.
Lapidos, Juliet. “Will My Plastic Bag Still Be Here in 2507?” Slate, June 27, 2007, http://www.slate.com/id/2169287.
Lewis, Sinclair. Babbitt (New York: Harcourt, Brace, and Co., 1922), 95.
Marchand, Roland. Advertising the American Dream: Making Way for Modernity, 1920–1940 (Berkeley: University of California Press, 1985), 7–9.
Marketwire, “TargetSpot Enters the Mobile Advertising Market,” news release, SmartBrief, February 23, 2010, http://www.smartbrief.com/news/aaaa/industryMW-detail.jsp?id=4217DD5E-932F-460E-BE30-4988E17DEFEC.
McAloney, Curt. “The 1984 Apple Commercial: The Making of a Legend,” Curt’s Media, http://www.curtsmedia.com/cine/1984.html.
Mierau, Christina B. Accept No Substitutes: The History of American Advertising (Minneapolis, MN: Lerner, 2000), 7–8.
Neff, Jack. “How Much Old Spice Body Wash Has the Old Spice Guy Sold?” AdvertisingAge, July 26, 2010, http://adage.com/article?article_id=145096.
New York Times, “How MTV Has Rocked Television Commercials,” October 9, 1989, http://www.nytimes.com/1989/10/09/business/the-media-business-how-mtv-has-rocked-television-commercials.html.
O’Barr, William M. “A Brief History of Advertising in America,” Advertising & Society Review 6, no. 3 (2005), http://muse.jhu.edu/journals/asr/v006/6.3unit02.html.
O’Guinn, Thomas, Chris Allen, and Richard Semenik, Advertising and Integrated Brand Promotion (Mason, OH: Cengage Learning, 2009), 133.
Rodnitzky, Jerome. Feminist Phoenix: The Rise and Fall of a Feminist Counterculture (Westport, CT: Praeger, 1999), 114–115.
Samuel, Lawrence. Brought to You By: Postwar Television Advertising and the American Dream (Austin, TX: University of Texas Press, 2001), 88–94.
Schneider, Keith. “Guides on Environmental Ad Claims,” New York Times, July 29, 1992, http://www.nytimes.com/1992/07/29/business/guides-on-environmental-ad-claims.html.
Shields, Mike. “Pitching Self-Regulation,” Adweek, February 15, 2010.
Stansky, Tanner. “14 Milestones in TV Product Placement,” Entertainment Weekly, July 28, 2008, http://www.ew.com/ew/article/0,,20215225,00.html.
Sterling, Bruce. “More Newspaper Calamity,” Wired, March 15, 2010, http://www.wired.com/beyond_the_beyond/2010/03/more-newspaper-calamity/.
Time, “The Press: Advertising v. New Deal,” September 1, 1941, http://www.time.com/time/magazine/article/0,9171,850703,00.html.
Vance, Jennifer. “Extra, Extra, Read All About It!” Penny Press, http://iml.jou.efl.edu/projects/Spring04/vance/pennypress.html.
12.2 Public Relations
Learning Objectives
- Describe the four models of public relations and the four stages of a typical public relations campaign.
- Analyze the role of public relations in media and culture.
- Analyze the ways public relations is taking the place of traditional advertising.
- Explain the concept of branding.
- Describe the uses of public relations in politics, government, and news media.
Whereas advertising is the paid use of media space to sell something, public relations (PR) is the attempt to establish and maintain good relations between an organization and its constituents (Theaker, 2004). Practically, PR campaigns strive to use the free press to encourage favorable coverage. In their book The Fall of Advertising and the Rise of PR, Al and Laura Ries make the point that the public trusts the press far more than they trust advertisements. Because of this, PR efforts that get products and brands into the press are far more valuable than a simple advertisement. Their book details the ways in which modern companies use public relations to far greater benefit than they use advertising (Ries & Ries, 2004). Regardless of the fate of advertising, PR has clearly come to have an increasing role in marketing and ad campaigns.
The Four Models of PR
Todd Hunt and James Grunig developed a theory of four models of PR. This framework has held up in the years since its development and is a good introduction to PR concepts (Grunig & Hunt, 1984).
Traditional Publicity Model
Under the traditional publicity model, PR professionals seek to create media coverage for a client, product, or event. These efforts can range from wild publicity stunts to simple news conferences to celebrity interviews in fashion magazines. P. T. Barnum was an early American practitioner of this kind of PR. His outrageous attempts at publicity worked because he was not worried about receiving negative press; instead, he believed that any coverage was a valuable asset. More recent examples of this style of extreme publicity include controversy-courting musicians such as Lady Gaga and Marilyn Manson. More restrained examples of this type of PR include the modern phenomenon of faded celebrities appearing on TV shows, such as Paula Abdul’s long-running appearances on American Idol.
Public Information Model
The goal of the public information model is to release information to a constituency. This model is less concerned with obtaining dramatic, extensive media coverage than with disseminating information in a way that ensures adequate reception. For example, utility companies often include fliers about energy efficiency with customers’ bills, and government agencies such as the IRS issue press releases to explain changes to existing codes. In addition, public interest groups release the results of research studies for use by policy makers and the public.
Persuasive Communication: Two-Way Asymmetric
The persuasive communication model, or two-way asymmetric model, works to persuade a specific audience to adopt a certain behavior or point of view. To be considered effective, this model requires a measured response from its intended audience.
Government propaganda is a good example of this model. Propaganda is the organized spreading of information to assist or weaken a cause (Dictionary). Edward Bernays has been called the founder of modern PR for his work during World War I promoting the sale of war bonds. One of the first professional PR experts, Bernays made the two-way asymmetric model his early hallmark. In a famous campaign for Lucky Strike cigarettes, he convinced a group of well-known celebrities to walk in the New York Easter parade smoking Lucky Strikes. Most modern corporations employ the persuasive communication model.
Two-Way Symmetric Model
The two-way symmetric model requires true communication between the parties involved. It facilitates a back-and-forth discussion that results in mutual understanding and an agreement respecting the wishes of both parties. This PR model is often practiced in town hall meetings and other public forums in which the public has a real effect on the results. In an ideal republic, Congressional representatives strictly employ this model. Many nonprofit groups that are run by boards and have public service mandates use this model to ensure continued public support.
Commercial ventures also rely on this model. PR can generate media attention or attract customers, and it can also ease communication between a company and its investors, partners, and employees. The two-way symmetric model is useful in communicating within an organization because it helps employees feel they are an important part of the company. Investor relations are also often carried out under this model.
PR Functions
Either private PR companies or in-house communications staffers carry out PR functions. A PR group generally handles all aspects of an organization’s or individual’s media presence, including company publications and press releases. Such a group can range from just one person to dozens of employees depending on the size and scope of the organization.
PR functions include the following:
- Media relations: takes place with media outlets
- Internal communications: occurs within a company between management and employees, and among subsidiaries of the same company
- Business-to-business: happens between businesses that are in partnership
- Public affairs: takes place with community leaders, opinion formers, and those involved in public issues
- Investor relations: occurs with investors and shareholders
- Strategic communication: intended to accomplish a specific goal
- Issues management: keeping tabs on public issues important to the organization
- Crisis management: handling events that could damage an organization’s image1
Anatomy of a PR Campaign
PR campaigns occur for any number of reasons. They can be a quick response to a crisis or emerging issue, or they can stem from a long-term strategy tied in with other marketing efforts. Regardless of its purpose, a typical campaign often involves four phases.
Initial Research Phase
The first step of many PR campaigns is the initial research phase. First, practitioners identify and qualify the issue to be addressed. Then, they research the organization itself to clarify issues of public perception, positioning, and internal dynamics. Strategists can also research the potential audience of the campaign. This audience may include media outlets, constituents, consumers, and competitors. Finally, the context of the campaign is often researched, including the possible consequences of the campaign and the potential effects on the organization. After considering all of these factors, practitioners are better educated to select the best type of campaign.
Strategy Phase
During the strategy phase, PR professionals usually determine objectives focused on the desired goal of the campaign and formulate strategies to meet those objectives. Broad strategies such as deciding on the overall message of a campaign and the best way to communicate the message can be finalized at this time.
Tactics Phase
During the tactics phase, the PR group decides on the means to implement the strategies they formulated during the strategy phase. This process can involve devising specific communication techniques and selecting the forms of media that suit the message best. This phase may also address budgetary restrictions and possibilities.
Evaluation Phase
After the overall campaign has been determined, PR practitioners enter the evaluation phase. The group can review their campaign plan and evaluate its potential effectiveness. They may also conduct research on the potential results to better understand the cost and benefits of the campaign. Specific criteria for evaluating the campaign when it is completed are also established at this time (Smith, 2002).
Examples of PR Campaigns
Since its modern inception in the early 20th century, PR has turned out countless campaigns—some highly successful, others dismal failures. Some of these campaigns have become particularly significant for their lasting influence or creative execution. This section describes a few notable PR campaigns over the years.
Diamonds for the Common Man
During the 1930s, the De Beers company had an enormous supply of diamonds and a relatively small market of luxury buyers. It launched a PR campaign to change the image of diamonds from a luxury good into an accessible and essential aspect of American life. The campaign began by giving diamonds to famous movie stars, using their built-in publicity networks to promote De Beers. The company created stories about celebrity proposals and gifts between lovers that stressed the size of the diamonds given. These stories were then given out to selected fashion magazines. The result of this campaign was the popularization of diamonds as one of the necessary aspects of a marriage proposal (Reid, 2006).
Big Tobacco Aids Researchers
In 1953, studies showing the detrimental health effects of smoking caused a drop in cigarette sales. An alliance of tobacco manufacturers hired the PR group Hill & Knowlton to develop a campaign to deal with this problem. The first step of the campaign Hill & Knowlton devised was the creation of the Tobacco Industry Research Committee (TIRC) to promote studies that questioned the health effects of tobacco use. The TIRC ran advertisements featuring the results of these studies, giving journalists who were addressing the subject an easy source to quote. The groups working against smoking were not familiar with media relations, making it harder for journalists to quote them and use their arguments.
The campaign was effective, however, not because it denied the harmful effects of smoking but because it stressed the disagreements between researchers. By providing the press with information favorable to the tobacco manufacturers and publicly promoting new filtered cigarettes, the campaign aimed to replace the idea that smoking was undeniably bad with the idea that there was disagreement over the effects of smoking. This strategy served tobacco companies well up through the 1980s.
Taco Bell Targets Mir
When the Russian space station Mir was set to crash-land in the Pacific Ocean in 2001, Taco Bell created a floating vinyl target and placed it in the ocean. Taco Bell promised to give every American a free taco if the space station hit the target. This simple PR stunt gave all the journalists covering the Mir crash landing a few lines to add to their stories. Scientists even speculated on the chances of the station hitting the target—slim to none. Ultimately, the stunt earned Taco Bell global publicity (BBC World, 2001).
PR as a Replacement for Advertising
In some cases, PR has begun overtaking advertising as the preferred way of promoting a particular company or product. For example, the tobacco industry offers a good case study of the migration from advertising to PR. Regulations prohibiting radio and TV cigarette advertisements had an enormous effect on sales. In response, the tobacco industry began using PR techniques to increase brand presence.
Tobacco company Philip Morris started underwriting cultural institutions and causes as diverse as the Joffrey Ballet, the Smithsonian, environmental awareness, and health concerns. Marlboro sponsored events that brought a great deal of media attention to the brand. For example, during the 1980s, the Marlboro Country Music Tour took famous country stars to major coliseums throughout the country and featured talent contests that brought local bands up on stage, increasing the audience even further. Favorable reviews of the shows generated positive press for Marlboro. Later interviews with country artists and books on country music history have also mentioned this tour.
On the fifth anniversary of the Vietnam Veterans Memorial in 1987, Marlboro’s PR groups organized a celebration hosted by comedian Bob Hope. Country music legends the Judds and Alabama headlined the show, and Marlboro paid for new names to be inscribed on the memorial. By attaching the Marlboro brand to such an important cultural event, the company gained an enormous amount of publicity. Just as importantly, these efforts at least partially restored the stature that the brand had lost due to health concerns (Saffir, 2000).
Branding
While advertising is an essential aspect of initial brand creation, PR campaigns are vital to developing the more abstract aspects of a brand. These campaigns work to position a brand in the public arena in order to give it a sense of cultural importance.
Shift From Advertising to PR
Pioneered by such companies as Procter & Gamble during the 1930s, the older, advertising-centric model of branding focused on the product, using advertisements to associate a particular branded good with quality or some other positive cultural value. Yet, as consumers became exposed to ever-increasing numbers of advertisements, traditional advertising’s effectiveness dwindled. The ubiquity of modern advertising means the public is skeptical of—or even ignores—claims advertisers make about their products. This credibility gap can be overcome, however, when PR professionals using good promotional strategies step in.
The new PR-oriented model of branding focuses on the overall image of the company rather than on the specific merits of the product. This branding model seeks to associate a company with specific personal and cultural values that hold meaning for consumers. In the early 1990s, for example, car company Saturn marketed its automobiles not as a means of transportation but as a form of culture. PR campaigns promoted the image of the Saturn family, associating the company with powerful American values and giving Saturn owners a sense of community. Events such as the 1994 Saturn homecoming sought to encourage this sense of belonging. Some 45,000 people turned out for this event; families gave up their beach holidays simply to come to a Saturn manufacturing plant in Tennessee to socialize with other Saturn owners and tour the facility.
Recently, Toyota faced a marketing crisis when it instituted a massive recall based on safety issues. To counter the bad press, the company launched a series of commercials featuring top Toyota executives, urging the public to keep their faith in the brand (Bernstein, 2010). Much like the Volkswagen ads half a century before, Toyota used a style of self-awareness to market its automobiles. The positive PR campaign presented Toyotas as cars with a high standard of excellence, backed by a company striving to meet customers’ needs.
Studies in Success: Apple and Nike
Apple has also employed this type of branding with great effectiveness. By focusing on a consistent design style in which every product reinforces the Apple experience, the computer company has managed to position itself as a mark of individuality. Despite the cynical outlook of many Americans regarding commercial claims, the notion that Apple is a symbol of individualism has been adopted with very little irony. Douglas Atkin, who has written about brands as a form of cult, readily admits and embraces his own brand loyalty to Apple:
I’m a self-confessed Apple loyalist. I go to a cafe around the corner to do some thinking and writing, away from the hurly-burly of the office, and everyone in that cafe has a Mac. We never mention the fact that we all have Macs. The other people in the cafe are writers and professors and in the media, and the feeling of cohesion and community in that cafe becomes very apparent if someone comes in with a PC. There’s almost an observable shiver of consternation in the cafe, and it must be discernable to the person with the PC, because they never come back.
Brand managers who once focused on the product now find themselves in the role of community leaders, responsible for the well-being of a cultural image (Atkin, 2004).
Kevin Roberts, the current CEO of Saatchi & Saatchi Worldwide, a branding-focused creative organization, has used the term “lovemark” as an alternative to trademark. This term encompasses brands that have created “loyalty beyond reason,” meaning that consumers feel loyal to a brand in much the same way they would toward friends or family members. Creating a sense of mystery around a brand generates an aura that bypasses the usual cynical take on commercial icons. A great deal of Apple’s success comes from the company’s mystique. Apple has successfully developed PR campaigns surrounding product releases that leak selected rumors to various press outlets but maintain secrecy over essential details, encouraging speculation by bloggers and mainstream journalists on the next product. All this combines to create a sense of mystery and an emotional anticipation for the product’s release.
Emotional connections are crucial to building a brand or lovemark. An early example of this kind of branding was Nike’s product endorsement deal with Michael Jordan during the 1990s. Jordan’s amazing, seemingly magical performances on the basketball court created his immense popularity, which was then further built up by a host of press outlets and fans who developed an emotional attachment to Jordan. As this connection spread throughout the country, Nike associated itself with Jordan and also with the emotional reaction he inspired in people. Essentially, the company inherited a PR machine that had been built around Jordan and that continued to function until his retirement (Roberts, 2003).
Branding Backlashes
An important part of maintaining a consistent brand is preserving the emotional attachment consumers have to that brand. Just as PR campaigns build brands, PR crises can damage them. For example, the massive Gulf of Mexico oil spill in 2010 became a PR nightmare for BP, an oil company that had been using PR to rebrand itself as an environmentally friendly energy company.
In 2000, BP began a campaign presenting itself as “Beyond Petroleum,” rather than British Petroleum, the company’s original name. By acquiring a major solar company, BP became the world leader in solar production and in 2005 announced it would invest $8 billion in alternative energy over the following 10 years. BP’s marketing firm developed a PR campaign that, at least on the surface, emulated the forward-looking two-way symmetrical PR model. The campaign conducted interviews with consumers, giving them an opportunity to air their grievances and publicize energy policy issues. BP’s website featured a carbon footprint calculator consumers could use to calculate the size of their environmental impact (Solman, 2008). A single explosion on BP’s deep-water oil rig in the Gulf of Mexico essentially nullified the PR work of the previous 10 years, immediately putting BP at the bottom of the list of environmentally concerned companies.
A company’s control over what its brand symbolizes can also lead to branding issues. The Body Shop, a cosmetics company that gained popularity during the 1980s and early 1990s, used PR to build its image as a company that created natural products and took a stand on issues of corporate ethics. The company teamed up with Greenpeace and other environmental groups to promote green issues and increase its natural image.
By the mid-1990s, however, revelations about the unethical treatment of franchise owners called this image into serious question. The Body Shop had spent a great deal of time and money creating its progressive, spontaneous image. Stories of travels to exotic locations to research and develop cosmetics were completely fabricated, as was the company’s reputation for charitable contributions. Even the origins of the company had been made up as a PR tool: The idea, name, and even product list had been ripped off from a small California chain called the Body Shop that was later given a settlement to keep quiet. The PR campaign of the Body Shop made it one of the great success stories of the early 1990s, but the unfounded nature of its PR claims undermined its image dramatically. Competitor L’Oréal eventually bought the Body Shop for a fraction of its previous value (Entine, 2007).
Other branding backlashes have plagued companies such as Nike and Starbucks. By building their brands into global symbols, both companies also came to represent unfettered capitalist greed to those who opposed them. During the 1999 World Trade Organization protests in Seattle, activists targeted Starbucks and Nike stores for physical attacks such as window smashing. Labor activists have also condemned Nike over the company’s use of sweatshops to manufacture shoes. Eventually, Nike created a vice president for corporate responsibility to deal with sweatshop issues.2
Blackspot: The Antibrand Brand
Adbusters, a publication devoted to reducing advertising’s influence on global culture, added action to its criticisms of Nike by creating its own shoe. Manufactured in union shops, Blackspot shoes contain recycled tire rubber and hemp fabric. The Blackspot logo is a simple round dot that looks like it has been scribbled with white paint, as if a typical logo had been covered over. The shoes also include a symbolic red dot on the toe with which to kick Nike. Blackspot shoes use the Nike brand to create their own antibrand, symbolizing progressive labor reform and environmentally sustainable business practices (New York Times, 2004).
Relationship With Politics and Government
Politics and PR have gone hand in hand since the dawn of political activity. Politicians communicate with their constituents and make their message known using PR strategies. Benjamin Franklin’s trip as ambassador to France during the American Revolution stands as an early example of political PR that followed the publicity model. At the time of his trip, Franklin was an international celebrity, and the fashionable society of Paris celebrated his arrival; his choice of a symbolic American-style fur cap immediately inspired a new style of women’s wigs. Franklin also took a printing press with him to produce leaflets and publicity notices that circulated through Paris’s intellectual and fashionable circles. Such PR efforts eventually led to a treaty with France that helped the colonists win their freedom from Great Britain (Isaacson, 2003).
Famous 20th-century PR campaigns include President Franklin D. Roosevelt’s Fireside Chats, a series of radio addresses that explained aspects of the New Deal. Roosevelt’s personal tone and his familiarity with the medium of radio helped the Fireside Chats become an important promotional tool for his administration and its programs. These chats aimed to justify many New Deal policies, and they helped the president bypass the press and speak directly to the people. More recently, Blackwater Worldwide, a private military company, dealt with criticisms of its actions in Iraq by changing its name. The new name, Xe Services, was the result of a large-scale PR campaign to distance the company from associations with civilian violence (Associated Press, 2009).
The proliferation of media outlets and the 24-hour news cycle have led to changes in the way politicians handle PR. The gap between old PR methods and new ones became evident in 2006, when then–Vice President Dick Cheney accidentally shot a friend during a hunting trip. Cheney, who had been criticized in the past for being secretive, did not make a statement about the accident for three days. Republican consultant Rich Galen explained Cheney’s silence as an older PR tactic intended to keep the discussion out of the media, a tactic that is far less effective in the modern digital world:
That entire doctrine has come and gone. Now the doctrine is you respond instantaneously, and where possible with a strong counterattack. A lot of that is because of the Internet, a lot of that is because of cable TV news (Associated Press, 2006).
PR techniques have been used in propaganda efforts throughout the 20th century. During the 1990s, the country of Kuwait employed Hill & Knowlton to encourage U.S. involvement in the Persian Gulf region. One of the more infamous examples of this campaign was a heavily reported account by a Kuwaiti girl testifying that Iraqi troops had dumped babies out of incubators in Kuwaiti hospitals. Outrage over this testimony helped galvanize opinion in favor of U.S. involvement. As it turned out, the Kuwaiti girl was really the daughter of the Kuwaiti ambassador and had not actually witnessed any of the alleged atrocities (Parsons, 2005).
Lobbyists also attempt to influence public policy using PR campaigns. The Water Environment Federation, a lobbying group representing the sewage industry, initiated a campaign to promote the application of sewage on farms during the early 1990s. The campaign coined the word biosolids to replace the term sludge. It then worked to encourage the use of this term as a way to popularize sewage as a fertilizer, providing information to public officials and representatives. In 1992, the U.S. Environmental Protection Agency adopted the new term and reclassified biosolids from a hazardous waste to a fertilizer. This renaming helped New York City eliminate tons of sewage by shipping it to states that allowed biosolids (Stauber & Rampton, 1995).
Political Branding
Politics has also embraced branding. Former President Bill Clinton described his political battles in terms of a brand war:
[The Republicans] were brilliant at branding. They said they were about values…. Everybody is a values voter, but they got the brand…they said they were against the death tax…what a great brand…. I did a disservice to the American people not by putting forth a bad plan, but by not being a better brander, not being able to explain it better (Kiley, 2008).
Branding has been used to great effect in recent elections. A consistently popular political brand is that of the outsider, or reform-minded politician. Despite his many years of service in the U.S. Senate, John McCain famously adopted this brand during the 2008 presidential election. McCain’s competitor, Barack Obama, also employed branding strategies. The Obama campaign featured several iconic portraits and slogans that made for a consistent brand and contributed to his victory in 2008. Before Obama’s inauguration in January 2009, an unprecedented amount of merchandise was sold, a further testament to the power of branding (Alberts, 2009).
Branding as a New Form of Communication
That so many different groups have adopted branding as a means of communication is a testament to its ubiquity. Even anticommercial, antibrand groups such as Adbusters have created brands to send messages. Social media sites have also encouraged branding techniques by allowing users to create profiles of themselves that they use to communicate their core values. This personal application is perhaps the greatest evidence of the impact of advertising and PR on modern culture. Branding, once a technique used by companies to sell their products, has become an everyday means of communication.
Key Takeaways
- The four models of PR include traditional publicity, public information, persuasive communication, and two-way symmetrical models.
- PR campaigns begin with a research phase, develop objectives during a strategy phase, formulate ways to meet objectives during the tactics phase, and assess the proposed campaign during the evaluation phase.
- Branding focuses on the lifestyles and values inherent in a brand’s image as opposed to the products that are manufactured. It can be quickly undone by PR crises such as the BP oil spill.
- PR has always been an important part of political campaigning and activity. In recent years, branding has become an important part of national political campaigns.
Exercises
Please answer the following short-answer questions. Each response should be a minimum of one paragraph.
- What are the four models of PR and the four stages of a typical PR campaign?
- Analyze the role (or roles) of PR in media and culture.
- In what ways is PR taking the place of traditional advertising?
- What is branding and how is it important to PR?
- How is PR used in the news media?
- In which ways is PR used in politics?
End-of-Chapter Assessment
Review Questions
Section 1
- How did advertising shape early consumer culture during the 1920s?
- Explain how government legislation has regulated advertisements and their claims.
- How did the creative revolution of the 1960s change advertising?
- How did the multiple-sponsor format change TV?
- Give an example of a digital media format and explain how it has incorporated advertising.
Section 2
- What PR model did P. T. Barnum utilize the most?
- Which phase of a PR campaign involves creating objectives?
- What was the focus of BP’s PR campaign before the 2010 oil spill?
- What are some of the key components of Apple’s branding strategy?
- How would you describe Barack Obama’s brand during the 2008 presidential campaign?
Critical Thinking Questions
- Do you think that government regulation of advertising is justified? Explain your answer.
- In your opinion, would most Americans give up their privacy in order to retain free, advertiser-supported services such as e-mail? Explain your answer.
- Do you think that print media can survive without traditional forms of advertising? Explain your answer.
- How do you think branding has affected American culture?
- How has branding affected political discourse in the United States?
Career Connection
Advertising has had an enormous influence on the ways that people present and imagine themselves. Personal branding has become an industry, with consultants and coaches ready to help anyone find his or her own brand. Creating a personal brand is a useful way to assess your skills and feelings about the advertising or PR professions.
Research the term personal brand using a search engine. Look for strategies that would help you construct your own brand. Imagine that you are a brand and describe what that brand offers. This does not need to be limited to professional capacities, but should represent your personal philosophy and life experiences. In 15 words or fewer, write a description of your brand.
Answer the following questions about your brand description:
- How well does the brand description capture your personality?
- How appealing do you think this brand would be to potential employers?
- How appealing do you think this brand would be to potential friends?
- Are you comfortable with the idea of promoting your own brand? Explain your answer.
- How do you think the personal branding process is different from or similar to the corporate branding process?
1. Theaker, 7.
2. Klein, 366.
References
Alberts, Sheldon. “Brand Obama,” Financial Post, January 17, 2009, http://www.financialpost.com/m/story.html?id=1191405.
Associated Press, “Blackwater Ditches Tarnished Brand Name,” USA Today, February 13, 2009, http://www.usatoday.com/news/military/2009-02-13-blackwater_N.htm.
Associated Press, “Cheney Hunting Accident Seen as P.R. Disaster,” MSNBC, February 16, 2006, http://www.msnbc.msn.com/id/11396608/ns/politics/.
Atkin, Douglas. Interview, Frontline, PBS, February 2, 2004, http://www.pbs.org/wgbh/pages/frontline/shows/persuaders/interviews/atkin.html.
BBC World, “Taco Bell Cashes in on Mir,” March 20, 2001, http://news.bbc.co.uk/2/hi/americas/1231447.stm.
Bernstein, Sharon. “Toyota faces a massive marketing challenge,” Los Angeles Times, February 9, 2010, http://articles.latimes.com/2010/feb/09/business/la-fi-toyota-marketing10-2010feb10.
Dictionary.com, s.v. “Propaganda,” http://dictionary.reference.com/browse/propaganda.
Entine, Jon. “Queen of Green Roddick’s ‘Unfair Trade’ Started When She Copied Body Shop Formula,” Daily Mail (London), September 15, 2007, http://www.dailymail.co.uk/femail/article-482012/Queen-Green-Roddicks-unfair-trade-started-copied-Body-Shop-formula.html.
Grunig, James E. and Todd Hunt, Managing Public Relations (Belmont, CA: Wadsworth Publishing, 1984).
Isaacson, Walter. Benjamin Franklin: An American Life (New York: Simon & Schuster, 2003), 325–349.
Kiley, David. “How Will Bill Clinton Manage His Brand?” BusinessWeek, June 10, 2008, http://www.businessweek.com/bwdaily/dnflash/content/jun2008/db2008069_046398.htm.
Ives, Nat. “Anti-Ad Group Tries Advertising,” New York Times, September 21, 2004, http://www.nytimes.com/2004/09/21/business/media/21adco.html.
Parsons, Patricia. Ethics in Public Relations (Sterling, VA: Chartered Institute of Public Relations, 2005), 7.
Reid, Stuart. “The Diamond Myth,” Atlantic, http://www.theatlantic.com/magazine/archive/2006/12/the-diamond-myth/5491/.
Ries, Al and Laura Ries, The Fall of Advertising and the Rise of PR (New York: HarperBusiness, 2004), 90.
Roberts, Kevin. Interview, Frontline, PBS, December 15, 2003, http://www.pbs.org/wgbh/pages/frontline/shows/persuaders/interviews/roberts.html.
Saffir, Leonard. Power Public Relations: How to Master the New PR (Lincolnwood, IL: NTC Contemporary, 2000), 77–88.
Smith, Ronald. Strategic Planning for Public Relations (Mahwah, NJ: Erlbaum Associates, 2002), 9–11.
Solman, Gregory. “BP: Coloring Public Opinion?” Adweek, January 14, 2008, http://www.adweek.com/aw/content_display/news/strategy/e3i9ec32f006d17a91cd72d6192b9f7599a.
Stauber, John and Sheldon Rampton, Toxic Sludge is Good for You! (Monroe, ME: Common Courage Press, 1995), 105–119.
Theaker, Alison. The Public Relations Handbook (Oxfordshire, England: Routledge, 2004), 4.