Hollywood’s Decline Is an American Tragedy

source: Washington Examiner

The movie industry has earned every ounce of the derision it receives. Hollywood's commitment to pumping left-wing cultural programming into the nation's bloodstream has driven its decades-long fade into irrelevance; the industry has morphed into a propaganda machine.