Nearly everyone agrees that the recent era of 'The Simpsons' lacks the force and influence the show had back in the '90s. That's assuming you've been watching it since then. So what led to the decline of this culturally iconic series? Here's a video essay looking precisely into that. The answer involves much more than just a change of writers; it also considers the television landscape and the politics of the era, which the show intentionally satirized during its early seasons.