There is no such thing as being color blind. It's a nice thought, but it's not the reality for many people. They are judged, targeted by police, or denied opportunities based on skin color. Conservatives have accused Hollywood of being liberal because many actors hold liberal views and the industry as a whole is far more relaxed about sexual mores. Hollywood has been an easy target for authoritarians who demand ideological purity and conformity, but that hardly makes the people who run Hollywood free of bias based on race, gender, religion, sexual orientation, or other traits. People of color, women, and gays have been marginalized on and off screen for most of Hollywood's existence.
Chris Rock recently spoke about the lack of opportunities for black actors, black executives green-lighting projects, and black professionals working behind the scenes. Actresses have spoken about the lack of good roles for years. They are reduced to sex objects or love interests, and they tend to lose work as they age unless they win an Oscar. Gay actors still remain in the closet, worried they might lose out on acting jobs.
Rarely does Hollywood green-light a film that is not aimed at young white males, which means it is ignoring the majority of society. Rarely do women, gays, or people of color get to direct films in Hollywood. I believe the percentage of films directed by women in Hollywood is around 6%. I don't know the statistics for people of color or openly gay directors offhand, but I think it's safe to say they are not getting many opportunities either. There have been only about a dozen black, Latino, Asian, and female nominees in the Best Director category in the roughly 80-year history of the Academy Awards. It wasn't until the last few years that a woman and directors of color actually won the award.