What Changed?

Since we began this class we have read different viewpoints of the American West and discussed the romanticization of the West. In the first half of the 20th century, Hollywood was obsessed with making Western films. The stories of outlaws and lawmen such as Jesse James, Billy the Kid, Wild Bill Hickok, and Wyatt Earp became Hollywood staples. Stars like John Wayne, Clint Eastwood, Roy Rogers, and Gene Autry took Hollywood by storm playing the cowboys and lawmen of the American West and raked in millions of dollars. It is hard to deny the public’s enthusiasm for these stories and legends during this era of Hollywood, yet toward the end of the 20th century the genre’s popularity declined. Westerns are still being produced; however, they aren’t nearly as popular as they once were and no longer draw big-name stars.

During this same period America fought the Korean and Vietnam Wars, waged the Cold War, and went through a massive Civil Rights movement. Did these major political events affect the public’s appetite for Westerns? It might be a stretch, but the idea that the genre faded simply because its aging stars no longer wanted to make action Westerns doesn’t seem realistic. Had America still wanted a steady supply of Western films, Hollywood would have found new actors to lead the genre, just as Bruce Willis, Sylvester Stallone, and Arnold Schwarzenegger led the way for actors like Vin Diesel, Dwayne Johnson, and Daniel Craig. So what exactly happened to the Western? Did the American public have a sudden awakening that this idea of the “Old West” is more myth and fantasy than truth, or did the cultural influences of the time shift Hollywood’s focus to another genre? Furthermore, if there were a new take on Westerns, would Americans start to sensationalize the West again, or would it be a failure?
