Why is it Hollywood's job to be diverse and representative of everyone? They're in it to make money. They owe us nothing.

aconissa:

I know they're in it to make money and I know they don't care about us, but that doesn't mean it's fucking okay. This logic is so messed up, like 'people are racist because it benefits them, so why should they change their behaviour?' BECAUSE IT'S FUCKING RACIST AND THAT'S NOT ACCEPTABLE. HOW IS THAT EVEN A QUESTION

Because part of Hollywood’s job is to reflect society. Hollywood has a tremendous influence on culture, and what we see up on the big screen has an effect on the world around us. If gay people are always jokes, that teaches society not to take them seriously. If black people are always thugs, then that teaches society to expect that in real life.

There are entire books about ethics and Hollywood. Hollywood should be diverse because that is the right and ethical thing to do, and because it has such an enormous influence, not just in the US but worldwide.

‘I Don’t Know Whether to Kiss You or Spank You’: A Half Century of Fear of an Unspanked Woman