don't know if i'm going that far - i know this thread/article is not really supposed to be deep cultural analysis - but it could be effectively (though maybe not decisively) argued that Hollywood has done more to spread American values across the world than the military