They prescribe medicine that's supposed to "fix" us. They create wars that are supposed to "help" us. They send us to schools to try to "teach" us. They give us news to "inform" us. So why can't They tell us the truth instead of what They want us to know?