I mean seriously, Hollywood, can we just stop doing this woke crap? Because this year alone, I’ve walked away from three shows (more or less) that suddenly started marching around with far-left ideologies, and I’m really tired of it.
Can you please just knock it off?
There are some shows that I don’t even start watching until they finish their season so I can binge the whole thing. One of those, until very recently, was Prodigal Son. The first season was excellent, although certainly not perfect, and I tend to like most things with Michael Sheen.