One thing I find really interesting about depictions of the Wild West in movies and TV is how they handle gendered and racialized violence. In shows like "Westworld," women exist mostly to be sexualized, or to have sexual violence enacted toward them. Similarly, people of color, whether "Native Americans" (the show, by nature, doesn't specify which nations they come from) or characters coded as Latinx (again, shows like "Westworld" don't tend to specify), are portrayed mainly as vehicles for the white characters' racism.

My question is this: Is depicting the "reality of the time" (as the writers of shows like this often claim to be doing) worth potentially encouraging these prejudices to continue? And is it fair to claim they're striving for "accuracy" when the women often have perfectly shaved armpits, etc., or is that claim just an excuse to exhibit violence toward historically (and presently) oppressed people, especially when, in many circumstances, the extremes depicted weren't actually true?