I mean seriously, Hollywood, can we just stop doing this woke crap? Because this year alone, I’ve walked away from three shows (more or less) that suddenly started marching around with far-left ideologies, and I’m really tired of it.
Can you please just knock it off?