Why is it that so many of the people trying to take over the world (both now and in the past) seem to be Christians and promoters of capitalism?
Why does the United States (a predominantly Christian country) fear Islam so much when for centuries Christians have been terrorizing most of the world?
Does Christianity promote violence by seeing itself as a 'city on a hill'? Or by needing to spread its gospel?
And on a perhaps more controversial note:
Why are most of these people white or of European/Western descent?
(Keep in mind, "all people are created equal.")