July 23, 2018

Hollywood Wasn't Always ‘Liberal’

Martina Markota, Rebel Contributor

Hollywood has always been political, but it hasn’t always been ‘liberal’. And by ‘liberal’, I really mean ‘far-left’.

In the 1920s, most of the studio heads spent a great deal of time trying to block unions and guilds, and they were Republicans. Let’s not forget that President Ronald Reagan came from Hollywood.

Today, the situation is very different, and the far left has more control than ever before.
