“It’s really political,” she says. “There’s an atmosphere where we don’t feel safe to talk [publicly] because we’re afraid our opinions and our thoughts can work against us. We’re constantly in this fear that this could hurt me from getting classes assigned next semester or my comments could hurt me when I’m trying to apply for a position.”
Why is it that academia and Hollywood, the two institutions forever banging on about workplace fairness and social equity, are among the worst offenders in those very areas? Should I be surprised that these are two of the most highly secularized sectors of our society?