Kalamazoo, MI
one-hundred-forty-one Years of Service to the Student


Sexism in Hollywood is Not a Surprise

Patricia Arquette delivering her Oscar speech after receiving her award for Best Supporting Actress. (Photo courtesy ABC)

About a month ago, the nominees for the 87th Academy Awards were announced, resulting in an uproar from the Internet. The fact that the acting nominees were all white and the directing, cinematography, and writing nominees were all men made many people ask, “What’s wrong with Hollywood?” I even took pen to paper and examined the predominantly white, male fare Hollywood seems to churn out year after year, despite the fact that America is growing more and more diverse.

This year’s Academy Awards had a strong social media presence, with viewers tweeting, reblogging, and updating their Facebook statuses about the awards in real time.

Whether it was The Representation Project calling on the media to “#AskHerMore” on the red carpet or Patricia Arquette demanding wage equality in an uplifting, although still problematic, acceptance speech, the Academy Awards this year were more about social justice than ever.

But this doesn’t mean women in Hollywood aren’t continuously thrown under the bus.

The Huffington Post listed some statistics about women’s inequality in Tinseltown: the Academy’s voting members are 94 percent white and 77 percent male, and of the 43 people on the board of governors, only six are women. In the awards’ 83-year history, only one woman, Kathryn Bigelow, has ever won Best Director, and research from San Diego State University shows that in 2013, 79 percent of the top 250 films had no female writers.

There is a severe representation problem in Hollywood, and I’m not just talking about who appears on the big screen. Exclusion happens behind the camera as well, and it’s pushing women and people of color further away from the glitter and gold of Hollywood.

When the incredibly diverse American public goes to the movies and only sees the stories of white men represented, it tells the American public that only the stories of white men are rewarded by box office success and gold statuettes.

There is a way to make Hollywood take notice: support movies that feature realistic female characters, have diverse casts, or are made by women or people of color.

Tell your friends about your mission and why you’re doing it. Get angry about it, and Hollywood will listen.

