Not only are the mainstream news media and career politicians hugely responsible for brainwashing people and trying their best to turn people into liberals, but Hollywood movies and TV shows have liberalism all over them. Liberalism is everywhere and being rammed down our throats every day. It's only getting worse.
Leftist views are all over Hollywood movies these days, and even some of your favorite TV shows as well. For example, the new Mad Max movie w/ Tom Hardy and Charlize Theron is a big-time leftist movie that promotes feminism.
If you take a look at the lists in the links below, you would be surprised at which movies and TV shows have liberal views in them, and the lists blow my mind, to be honest with you.
All the TV shows you guys love, like "Game of Thrones," "Dexter," "Breaking Bad," "The Following," "Doctor Who," "Orange Is the New Black," "House of Cards," etc., all promote leftist views in some way.
Why is "Game of Thrones" a leftist TV show? Well, for one, the show glorifies homosexuality; it's full of gay sex scenes.
The mainstream media, Hollywood films, and TV shows will do all they can to ram liberalism down our throats and try to force us to change our political views. The entertainment industry is doing a good job of that, too.
I may like some of these movies and shows myself, but I just like them for the stories and the acting; I don't really care about the leftist views.
There are a handful of movies that have conservative views, though, and here is a bunch of films that promote "right-wing" views; the list is pretty long:
I love how they list Ghostbusters, Star Wars, Indiana Jones, and the Expendables movies as "right-wing" films, 'cause I agree those films do have "right-wing" views, at least a little bit.