Hollywood used to be the pride of America. Now, it’s a joke—and the numbers prove it. Only 29 percent of Californians, the very people living in Hollywood’s backyard, think the film industry is good for our culture. That’s less than one-third of the population in the bluest state in the country. Let that sink in.
This collapse didn’t happen by accident. Hollywood elites abandoned everyday Americans a long time ago. Instead of making movies that inspire and unite us, they spend every award show lecturing the public and pushing their far-left ideology. Regular families are fed up. People don’t want to be scolded about things like gender identity by millionaires who live inside their own bubble of privilege.
Liberals in the film industry love to pretend they represent the “voice of the people.” Yet the people have spoken—and they’re saying Hollywood is out of touch and tearing apart our national culture. These woke celebrities care more about globalist causes and impressing each other than telling real American stories. While Main Street struggles to afford groceries, Hollywood’s “social warriors” fly private jets to climate conferences and lecture us on morality.
Of course, the usual suspects on the left will ignore the glaring message. They turn a blind eye while middle America tunes out. Young people, parents, and even the folks in Los Angeles County are waking up to the fact that Hollywood is more interested in pushing politics than entertaining families. The entertainment industry likes to brag about being bold, but it refuses to listen when everyday Americans say "enough is enough."
When less than a third of Californians, in the very heartland of liberal entertainment, trust Hollywood to help our culture, it's clear: this industry has completely lost the plot. Maybe next year, the award shows should start with a thank-you speech to the voters who are finally tuning out this woke nonsense. How much lower does Hollywood have to sink before it realizes America is over it?
Source: Breitbart