8.2 The History of Movies

Learning Objectives

  1. Identify key points in the development of the motion picture industry.
  2. Identify key developments in motion picture technology.
  3. Identify influential films in movie history.

The movie industry emerged in the late 19th century through a series of technological developments: the creation of photography, the discovery of the illusion of motion by combining individual still images, and the study of human and animal locomotion. The history presented here begins at the culmination of these developments, when the idea of the motion picture as an entertainment industry first emerged. Since then, the industry has seen extraordinary transformations, some driven by the artistic visions of individual participants, some by commercial necessity, and still others by accident. The complexity of cinema’s history makes it impossible for this book to mention every important innovator and movement. Nonetheless, this chapter will help readers understand the broad arc of the development of a medium that has captured the imaginations of audiences worldwide for over a century.

The Beginnings: Motion Picture Technology of the Late 19th Century

While the experience of watching movies on smartphones may seem like a drastic departure from the communal nature of traditional film viewing, in some ways the small-format, single-viewer display recaptures film’s early roots. In 1891, the inventor Thomas Edison, together with his assistant William Dickson, introduced what they called the kinetoscope, a device that would become the predecessor of the motion picture projector. The kinetoscope consisted of a cabinet with a window through which individual viewers could experience the illusion of a moving image (Gale Virtual Reference Library; British Movie Classics). A perforated celluloid film strip with a sequence of images on it was rapidly spooled between a light bulb and a lens, creating the illusion of motion (Britannica). The images viewers saw in the kinetoscope captured events and performances that Edison staged at his film studio in West Orange, New Jersey, especially for the Edison kinetograph (the camera that produced kinetoscope film sequences): circus performances, dancing women, cockfights, boxing matches, and even a tooth extraction by a dentist (Robinson, 1994).

The Edison kinetoscope. Source: todd.vision – Kinetoscope – CC BY 2.0.

As the kinetoscope gained popularity, the Edison Company began installing machines in hotel lobbies, amusement parks, and penny arcades, and soon kinetoscope parlors—where customers could pay around 25 cents for admission to a bank of machines—had opened around the country. However, when friends and collaborators suggested that Edison find a way to project his kinetoscope images for audience viewing, he apparently refused, reasoning that projection would sell far fewer machines and thus yield less profit (Britannica).

Because Edison hadn’t secured an international patent for his invention, variations of the kinetoscope soon appeared throughout Europe. This new form of entertainment achieved instant success, and many mechanics and inventors, seeing an opportunity, began toying with methods of projecting moving images onto a larger screen. It was the invention of two brothers, however—Auguste and Louis Lumière, photographic goods manufacturers in Lyon, France—that saw the most commercial success. In 1895, the brothers patented the cinématographe (the origin of the term cinema), a lightweight film projector that also functioned as a camera and printer. Unlike the Edison kinetograph, the cinématographe was light enough to make outdoor filming easy, and over the years the brothers used it to shoot well over 1,000 short films, most of which depicted scenes from everyday life.

Believing that audiences would get bored watching scenes they could just as easily observe on a casual walk around the city, Louis Lumière claimed that the cinema was “an invention without a future” (Menand, 2005), but demand for motion pictures grew at such a rapid rate that representatives of the Lumière company soon traveled throughout Europe and the world, showing half-hour screenings of the company’s films. While cinema initially competed with other popular forms of entertainment—circuses, vaudeville acts, theater troupes, magic shows, and many others—eventually it would supplant these various entertainments as the main commercial attraction (Menand, 2005). Within a year of the Lumières’ first commercial screening, competing film companies offered moving-picture acts in music halls and vaudeville theaters across Great Britain. In the United States, the Edison Company, having purchased the rights to an improved projector it called the Vitascope, held its first film screening in April 1896 at Koster and Bial’s Music Hall in Herald Square, New York City.

Modern audiences may find it difficult to understand film’s profound impact on its earliest viewers. However, the sheer volume of reports about early audiences’ disbelief, delight, and even fear at what they witnessed suggests that many people were overwhelmed by the experience. Spectators gasped at the realistic details in films such as Robert Paul’s Rough Sea at Dover, and at times people panicked and tried to flee the theater during films in which trains or moving carriages sped toward the audience (Robinson). Even the public’s perception of film as a medium differed considerably from the contemporary understanding: to early viewers, the moving image was simply an improvement on the photograph, a medium they already knew well—which explains why the earliest films documented events in brief segments but did not attempt to tell stories. During this “novelty period” of cinema, the phenomenon of the film projector itself interested audiences, so vaudeville halls advertised the kind of projector they used (for example, “The Vitascope—Edison’s Latest Marvel”) (Balcanasu et al.), rather than the names of the films (Britannica Online).

Early film screening
In December 1895, in the basement lounge of the Grand Café on the Boulevard des Capucines in Paris, the Lumières held the world’s first commercial film screening: a sequence of about 10 short scenes, including the brothers’ first film, seen above, Workers Leaving the Lumière Factory, a segment lasting less than a minute and depicting workers leaving the family’s photographic instrument factory at the end of the day (Encyclopedia of the Age of Industry and Empire). Source: Craig Duffy – Workers Leaving The Lumiere Factory – CC BY-NC 2.0.

By the close of the 19th century, as public excitement over the moving picture’s novelty gradually wore off, filmmakers began to experiment with film’s possibilities as a medium in itself (not simply as a tool for documentation, analogous to the camera or the phonograph). Technical innovations allowed filmmakers like Parisian cinema owner Georges Méliès to experiment with special effects that produced seemingly magical transformations on screen: flowers turned into women, people disappeared with puffs of smoke, a man appeared where a woman had just been standing, and other similar tricks (Robinson).

Not only did Méliès, a former magician, invent the “trick film,” which producers in England and the United States began to imitate, but he also transformed cinema into the narrative medium it is today. Whereas before, filmmakers had only ever created single-shot films that lasted a minute or less, Méliès began joining these short films together to create stories. His 30-scene Trip to the Moon (1902), a film based on a Jules Verne novel, likely stands as the most widely seen production in cinema’s first decade (Robinson). However, Méliès never developed his technique beyond treating the narrative film as a staged theatrical performance; his camera, representing the vantage point of an audience facing a stage, never moved during the filming of a scene. In 1912, Méliès released his last commercially successful production, The Conquest of the Pole, and from then on, he lost audiences to filmmakers who experimented with more sophisticated techniques (Encyclopedia of Communication and Information).

A Trip to the Moon
Georges Méliès’s Trip to the Moon incorporated fantasy elements and used “trick” filming techniques, both of which heavily influenced future filmmakers. Source: Georges Méliès, Le Voyage dans la lune, marked as public domain, via Wikimedia Commons.

The Nickelodeon Craze (1904–1908)

Edwin S. Porter, a projectionist and engineer for the Edison Company, broke with the stagelike compositions of Méliès-style films in his 12-minute film The Great Train Robbery (1903). The film incorporated editing, camera pans, rear projections, and diagonally composed shots to produce a continuity of action. Not only did The Great Train Robbery establish the realistic narrative as a standard in cinema, it also became the first major box-office hit. Its success paved the way for the growth of the film industry, as investors, recognizing the motion picture’s great moneymaking potential, began opening the first permanent film theaters around the country.

Known as nickelodeons because of their five-cent admission charge, these early motion picture theaters, often housed in converted storefronts, proved especially popular among the working class of the time, who couldn’t afford live theater. Between 1904 and 1908, around 9,000 nickelodeons appeared in the United States. The nickelodeon’s popularity established film as a mass entertainment medium (Dictionary of American History).

The “Biz”: The Motion Picture Industry Emerges

As the demand for motion pictures grew, production companies sprang up to meet it. At the peak of nickelodeon popularity in 1910 (Britannica Online), about 20 major motion picture companies operated in the United States. However, heated disputes often broke out among these companies over patent rights and industry control, leading even the most powerful among them to fear fragmentation that would loosen their hold on the market (Fielding, 1967). Because of these concerns, the 10 leading companies—including Edison, Biograph, Vitagraph, and others—formed the Motion Picture Patents Company (MPPC) in 1908. The MPPC served as a trade group that pooled the most significant motion picture patents and established an exclusive contract between these companies and the Eastman Kodak Company as a supplier of film stock. Also known as the Trust, the MPPC attempted to standardize the industry and shut out competition through monopolistic control. Under the Trust’s licensing system, only certain licensed companies could participate in the exchange, distribution, and production of film at different levels of the industry—a shut-out tactic that eventually backfired, leading the excluded, independent distributors to organize in opposition to the Trust (Britannica Online).

The Rise of the Feature

In these early years, theaters still ran single-reel films, which came at a standard length of 1,000 feet, allowing for about 16 minutes of playing time. However, companies began to import multiple-reel films from European producers around 1907, and the format gained popular acceptance in the United States in 1912 with the release of Louis Mercanton’s highly successful Queen Elizabeth, a three-and-a-half-reel “feature” starring the French actress Sarah Bernhardt. As exhibitors began to show more features—the term assigned to multiple-reel films—they discovered several advantages over the single-reel short. For one thing, audiences saw these longer films as special events and did not mind paying more for admission. In addition, the popularity of feature narratives meant features generally enjoyed longer runs in theaters than their single-reel predecessors (Motion Pictures). The feature film also gained popularity among the middle classes, who saw its length as analogous to the more “respectable” entertainment of live theater (Motion Pictures). Following the example of the French film d’art, U.S. feature producers often took their material from sources that would appeal to a wealthier and better-educated audience, such as histories, literature, and stage productions (Robinson).

As it turns out, the feature film helped bring about the eventual downfall of the MPPC. The inflexible structuring of the Trust’s exhibition and distribution system made the organization resistant to change. When Vitagraph, a movie studio and Trust member, began to release features like A Tale of Two Cities (1911) and Uncle Tom’s Cabin (1910), the Trust forced it to exhibit the films serially in single-reel showings to keep within industry standards. The MPPC also underestimated the appeal of the star system, a trend that began when producers chose famous stage actors like Mary Pickford and James O’Neill to play the leading roles in their productions and to grace their advertising posters (Robinson). Because of the MPPC’s inflexibility, independent companies were left to capitalize exclusively on two important trends of film’s future: multiple-reel features and star power. Today, few people recognize names like Vitagraph or Biograph, but the independents that outlasted them—Universal, Goldwyn (which would later merge with Metro and Mayer to form MGM), Fox (later 20th Century Fox), and Paramount (the later version of the Lasky Corporation)—have become industry juggernauts.

Hollywood

As moviegoing increased in popularity among the middle class, and as feature films began keeping audiences in their seats for longer periods, exhibitors found a need to create more comfortable and richly decorated theater spaces to attract their audiences. These “dream palaces,” so called because of their often lavish embellishments of marble, brass, gilding, and cut glass, not only came to replace the nickelodeon theater but also created the demand that would lead to the Hollywood studio system. Some producers realized that they could meet the growing demand for new work only if they produced films on a regular, year-round schedule. However, this proved impractical under the existing system, which often relied on outdoor filming and shot footage predominantly in Chicago and New York—two cities whose weather prevented outdoor filming for a significant portion of the year. Several companies attempted filming in warmer locations such as Florida, Texas, and Cuba, but producers eventually settled on an ideal candidate: a small, industrial suburb of Los Angeles called Hollywood.

Hollywood proved to be an ideal location for many reasons. The location combined a temperate climate with year-round sun. In addition, the abundantly available land did not cost much, and the location allowed close access to diverse topographies: mountains, lakes, deserts, coasts, and forests. By 1915, more than 60 percent of U.S. film production operated out of Hollywood (Britannica Online).

The Art of Silent Film

While commercial factors drove the development of narrative film, individual artists turned it into a medium of personal expression. Motion pictures of the silent era were generally simplistic in appearance: acted in overly animated movements to engage the eye, and accompanied by live music, played by musicians in the theater, and by written titles that created a mood and narrated the story. Within the confines of this medium, one filmmaker in particular emerged to transform silent film into art and to unlock its potential as a medium of serious expression and persuasion. D. W. Griffith, who entered the film industry as an actor in 1907, quickly moved to a directing role in which he worked closely with his camera crew to experiment with shots, angles, and editing techniques that could heighten the emotional intensity of his scenes. He found that by practicing parallel editing, in which a film alternates between two or more scenes of action, he could create an illusion of simultaneity. He could then heighten the tension of the film’s drama by cutting between the scenes more and more rapidly until they converged. Griffith used this technique to great effect in his controversial film The Birth of a Nation, which this chapter discusses in greater detail in the next section. Other techniques that Griffith employed to new effect included panning shots, through which he established a sense of scene and engaged his audience more fully in the experience of the film, and tracking shots—shots that traveled with the movement of a scene (Motion Pictures)—which allowed the audience, through the eye of the camera, to participate in the film’s action.

MPAA: Combating Censorship

As film became an increasingly lucrative U.S. industry, prominent industry figures like D. W. Griffith, slapstick comedian/director Charlie Chaplin, and actors Mary Pickford and Douglas Fairbanks grew extremely wealthy and influential. The public had conflicting attitudes toward stars and their often extravagant lifestyles: on the one hand, people idolized celebrities and imitated them in popular culture, yet at the same time, they criticized celebrities for representing a threat, on and off screen, to traditional morals and social order. And much as it does today, the news media liked to sensationalize the lives of celebrities to sell stories. Comedian Roscoe “Fatty” Arbuckle, who worked alongside future icons Charlie Chaplin and Buster Keaton, found himself at the center of one of the biggest scandals of the silent era. When Arbuckle hosted a marathon party over Labor Day weekend in 1921, one of his guests, model Virginia Rappe, was rushed to the hospital, where she later died. Reports of a drunken orgy, rape, and murder surfaced. Following World War I, the United States had begun to undergo significant social reforms, such as Prohibition, and many feared that movies and their stars could threaten the moral order of the country. Because of the nature of the crime and the celebrity involved, those fears became inextricably tied to the Arbuckle case (Motion Pictures). Even though autopsy reports ruled that Rappe had died of causes for which Arbuckle could not have been responsible, the comedian still faced trial; a jury acquitted him of manslaughter, but his career never recovered.

The Arbuckle affair and a series of other scandals only increased public fears about Hollywood’s impact. In response to this perceived threat, state and local governments increasingly tried to censor the content of films that depicted crime, violence, and sexually explicit material. Deciding that they needed to protect themselves from government censorship and to foster a more favorable public image, the major Hollywood studios organized in 1922 to form an association they called the Motion Picture Producers and Distributors of America (later renamed the Motion Picture Association of America, or MPAA). Among other things, the MPAA instituted a code of self-censorship for the motion picture industry. Today, the MPAA operates by a voluntary rating system, which means producers can voluntarily submit a film for review to receive a rating designed to alert viewers to the age-appropriateness of a film, while still protecting the filmmakers’ artistic freedom (Motion Picture Association of America).

Silent Film’s Demise

In 1925, Warner Bros. was a small Hollywood studio looking for opportunities to set itself apart from its competitors. When representatives from Western Electric offered to sell the studio the rights to a new technology they called Vitaphone, a sound-on-disc system that had failed to capture the interest of any of the industry giants, Warner Bros. executives took a chance, predicting that the novelty of talking films might help them make a quick, short-term profit. Little did they anticipate that their gamble would not only establish them as a major Hollywood presence but also change the industry forever.

The pairing of sound with motion pictures had occurred before this point. Edison, after all, had commissioned the kinetoscope to create a visual accompaniment to the phonograph, and many early theaters had orchestra pits to provide musical accompaniment to their films. Even the smaller picture houses with lower budgets almost always had an organ or piano. When Warner Bros. purchased Vitaphone technology, it planned to use it to provide prerecorded orchestral accompaniment for its films, thereby increasing their marketability to the smaller theaters that didn’t have their own orchestra pits (Gochenour, 2000). In 1926, Warner debuted the system with the release of Don Juan, a costume drama accompanied by a recording of the New York Philharmonic Orchestra; the public responded enthusiastically (Motion Pictures). By 1927, after a $3 million campaign, Warner Bros. had wired more than 150 theaters in the United States, and it released its second sound film, The Jazz Singer, in which the actor Al Jolson improvised a few lines of synchronized dialogue and sang six songs. The film constituted a breakthrough. Audiences, hearing an actor speak on screen for the first time, became enchanted (Gochenour). While radio, a new and popular entertainment, had siphoned audiences away from the picture houses for some time, with the birth of the “talkie,” or talking film, audiences once again returned to the cinema in large numbers, lured by the promise of seeing and hearing their idols perform (Higham, 1973). By 1929, three-fourths of Hollywood films had some form of sound accompaniment, and by 1930, silent films had become relics of the past (Gochenour).

“I Don’t Think We’re in Kansas Anymore”: Film Goes Technicolor

Although filmmakers had been able to add color to films for some time through tinting and hand painting (Georges Méliès, for instance, employed a crew to hand-paint many of his films), neither method worked for large-scale projects. The hand-painting technique became impractical with the advent of mass-produced film, and the tinting process, which filmmakers discovered interfered with the transmission of sound, was abandoned with the rise of the talkie. In 1922, however, Herbert Kalmus’s Technicolor company introduced a dye-transfer technique that allowed it to produce a full-length film, The Toll of the Sea, in two primary colors (Gale Virtual Reference Library). Because they used only two colors, The Toll of the Sea (1922), The Ten Commandments (1923), and other early Technicolor films did not look very lifelike. By 1932, Technicolor had designed a three-color system with more realistic results, and for the next 25 years, all studios produced color films with this improved system. Disney’s Three Little Pigs (1933) and Snow White and the Seven Dwarfs (1937) and films with live actors, like MGM’s The Wizard of Oz (1939) and Gone With the Wind (1939), experienced early success using Technicolor’s three-color method.

Despite the success of certain color films in the 1930s, Hollywood, like the rest of the United States, felt the impact of the Great Depression, and the expense of special cameras, crews, and Technicolor lab processing made color films impractical for studios trying to cut costs. Black-and-white film would therefore remain the industry standard until the end of the 1940s (Motion Pictures in Color).

Rise and Fall of the Hollywood Studio

The spike in theater attendance that followed the introduction of talking films changed the economic structure of the motion picture industry, bringing about some of the largest mergers in industry history. By 1930, eight studios produced 95 percent of all American films, and they continued to grow even during the Depression. The five most influential of these studios—Warner Bros., Metro-Goldwyn-Mayer, RKO, 20th Century Fox, and Paramount—practiced vertical integration; that is, they controlled every part of the system as it related to their films, from production to release, distribution, and even viewing. Because they owned theater chains worldwide, these studios controlled which movies exhibitors ran, and because they “owned” a stock of directors, actors, writers, and technical assistants by contract, each studio produced films of a particular character.

The late 1930s and early 1940s are sometimes referred to as the “Golden Age” of cinema, a time of unparalleled success for the movie industry; by 1939, film ranked as the 11th-largest industry in the United States, and during World War II, when the U.S. economy flourished, two-thirds of Americans attended the theater at least once a week (Britannica Online). Some of the most acclaimed movies in history saw their release during this period, including Citizen Kane and The Grapes of Wrath. However, postwar inflation, a temporary loss of key foreign markets, the advent of television, and other factors combined to bring that rapid growth to an end. In 1948, the case of United States v. Paramount Pictures—mandating competition and forcing the studios to relinquish control over theater chains—dealt the final devastating blow from which the studio system would never recover. Control of the major studios reverted to Wall Street, where multinational corporations eventually absorbed the studios, and the powerful studio heads lost the influence they had held for nearly 30 years (Baers, 2000).

Who's going to the movies graph
Rise and Decline of Movie Viewing During Hollywood’s “Golden Age.” Source: Graph from Pautz, Michelle C. 2002. The Decline in Average Weekly Cinema Attendance: 1930–2000. Issues in Political Economy, 11 (Summer): 54–65.

Post–World War II: Television Presents a Threat

While economic factors and antitrust legislation played key roles in the decline of the studio system, the advent of television was the primary factor. Given the opportunity to watch “movies” from the comfort of their own homes, the millions of Americans who owned a television by the early 1950s attended the cinema far less regularly than they had only a few years earlier (Motion Pictures). In an attempt to win back diminishing audiences, studios did their best to exploit the greatest advantages film held over television. For one thing, television broadcast only in black and white in the 1950s, whereas the film industry had the advantage of color. While producing a color film still required a considerable budget in the late 1940s, two changes in the early 1950s made color not only more affordable but more realistic in appearance. In 1950, as a result of antitrust legislation, Technicolor lost its monopoly on the color film industry, allowing other providers to offer more competitive pricing on filming and processing services. At the same time, Kodak came out with a multilayer film stock that made it possible to use more affordable cameras and to produce a higher-quality image. Kodak’s Eastmancolor option became an integral component in converting the industry to color. In the late 1940s, studios released only 12 percent of features in color; by 1954, after the release of Kodak Eastmancolor, more than 50 percent of movies featured color (Britannica Online).

Filmmakers also tried to capitalize on the sheer size and scope of the cinematic experience. With the release of the epic biblical film The Robe in 1953, 20th Century Fox introduced CinemaScope, the method that nearly every studio in Hollywood would soon adopt: a technology that allowed filmmakers to squeeze a wide-angle image onto conventional 35-mm film stock, thereby increasing the aspect ratio (the ratio of a screen’s width to its height) of their images. This wide-screen format increased the immersive quality of the theater experience. Nonetheless, even with these advancements, movie attendance never again reached the record numbers of 1946, at the peak of the Golden Age of Hollywood (Britannica Online).

Mass Entertainment, Mass Paranoia: HUAC and the Hollywood Blacklist

The Cold War with the Soviet Union began in 1947, and with it came the widespread fear of communism, not only from the outside but equally from within. To counter this perceived threat, the House Un-American Activities Committee (HUAC) commenced investigations to locate communist sympathizers in America whom it suspected of conducting espionage for the Soviet Union. In the highly conservative and paranoid atmosphere of the time, Hollywood, the source of a mass-cultural medium, came under fire amid fears that studios had embedded subversive, communist messages in films. In November 1947, Congress called more than 100 people in the movie business to testify before HUAC about their and their colleagues’ involvement with communist affairs. Of those investigated, 10 in particular refused to cooperate with the committee’s questions. These 10, later known as the Hollywood Ten, lost their jobs and served sentences of up to a year in prison. The studios, already slipping in influence and profit, cooperated to save themselves, and many producers signed an agreement stating that no communists would work in Hollywood.

The hearings, which recommenced in 1951 with the rise of Senator Joseph McCarthy’s influence, turned into a kind of witch hunt as committee members asked witnesses to testify against their associates, and a blacklist of suspected communists evolved. More than 324 individuals lost their jobs in the film industry as a result of blacklisting (the denial of work in a certain field or industry) and HUAC investigations (Georgakas, 2004; Mills, 2007; Dressler et al., 2005).

Down With the Establishment: Youth Culture of the 1960s and 1970s

Movies of the late 1960s began attracting a younger demographic, as a growing number of young people became drawn to films like Sam Peckinpah’s The Wild Bunch (1969), Stanley Kubrick’s 2001: A Space Odyssey (1968), Arthur Penn’s Bonnie and Clyde (1967), and Dennis Hopper’s Easy Rider (1969)—all revolutionary in their genres—that displayed a sentiment of unrest toward conventional social orders and included some of the earliest instances of realistic and brutal violence in film. These four films grossed so much money at the box office that producers began churning out low-budget copycats to draw in a new, profitable market (Motion Pictures). While this led to a rise in youth-culture films, few of them saw great success. However, the new liberal attitudes toward depictions of sex and violence in these films represented a sea change in the movie industry, one that manifested in many movies of the 1970s, including Robert Altman’s M*A*S*H (1970), Francis Ford Coppola’s The Godfather (1972), William Friedkin’s The Exorcist (1973), and Steven Spielberg’s Jaws (1975), all four of which saw great financial success (Britannica Online; Belton, 1994).

Blockbusters, Knockoffs, and Sequels

In the 1970s, with the rise of work by Coppola, Spielberg, George Lucas, Martin Scorsese, and others, a new breed of director emerged. These young, film-school-educated directors contributed a sense of professionalism, sophistication, and technical mastery to their work, leading to a wave of blockbuster productions, including Close Encounters of the Third Kind (1977), Star Wars (1977), Raiders of the Lost Ark (1981), and E.T.: The Extra-Terrestrial (1982). The computer-generated special effects available at this time also contributed to the success of a number of large-budget productions. In response to these and several earlier blockbusters, movie production and marketing techniques began to shift, with studios investing more money in fewer films in the hopes of producing more big successes. For the first time, the hefty sums producers and distributors invested didn’t go to production costs alone; distributors discovered the benefits of TV and radio advertising and found that doubling their advertising costs could increase profits as much as three or four times over. With the opening of Jaws, one of the five top-grossing films of the decade (and the highest-grossing film of all time until the release of Star Wars in 1977), Hollywood embraced the wide-release method of movie distribution, abandoning the release methods of earlier decades, in which a film would debut in only a handful of select theaters in major cities before it became gradually available to mass audiences. Studios released Jaws in 600 theaters simultaneously, and the big-budget films that followed came out in anywhere from 800 to 2,000 theaters nationwide on their opening weekends (Belton; Hanson & Garcia-Myers, 2000).

The major Hollywood studios of the late 1970s and early 1980s, now run by international corporations, tended to favor the conservative gamble of the tried and true, and as a result, the period saw an unprecedented number of high-budget sequels—as in the Star Wars, Indiana Jones, and Godfather films—as well as imitations and adaptations of earlier successful material, such as the plethora of “slasher” films that followed the success of the 1979 thriller Halloween. Additionally, corporations sought revenue sources beyond the movie theater, looking to the video and cable releases of their films. Introduced in 1975, the VCR became nearly ubiquitous in American homes by 1998, with 88.9 million households owning the appliance (Rosen & Meier, 2000). Cable television experienced slower growth, but ownership of VCRs gave people a new reason to subscribe, and cable subsequently expanded as well (Rogers). And the newly introduced concept of film-based merchandise (toys, games, books, etc.) allowed companies to increase profits even more.

The 1990s and Beyond

The 1990s saw the rise of two divergent strands of cinema: the technically spectacular blockbuster, with special, computer-generated effects, and the independent, low-budget film. Studios enhanced the capabilities of special effects when they began manipulating film digitally, noticeable in blockbusters such as Terminator 2: Judgment Day (1991) and Jurassic Park (1993). Films with an epic scope—Independence Day (1996), Titanic (1997), and The Matrix (1999)—also employed a range of computer-animation techniques and special effects to wow audiences and to draw more viewers to the big screen. Toy Story (1995), the first fully computer-animated film, and those that came after it, such as Antz (1998), A Bug’s Life (1998), and Toy Story 2 (1999), displayed the improved capabilities of computer-generated animation (Sedman, 2000). At the same time, independent directors and producers, such as the Coen brothers and Spike Jonze, enjoyed increased popularity, often for lower-budget films that audiences were more likely to watch on video at home (Britannica Online). A prime example occurred at the 1996 Academy Awards, when independent films dominated the Best Picture category: only one nominee, Jerry Maguire, came from a big film studio, while independent films earned the rest of the nominations. The growth of both independent movies and special-effects-laden blockbusters continues to the present day.

License


Understanding Media and Culture Copyright © 2024 by North Idaho College is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.
