Do you care what people think of you based on a prejudiced ideology?
What is more important to you: what you think of yourself, or what people who have never met you assume you are probably like?
Does it make you feel good to pretend to be superior by putting others down?
Does being a citizen of the most powerful nation on Earth make you better than everyone else?
What do you think makes the world hate the United States so much?
9/11 changed the world and America forever. For a brief moment the world stopped. Many people think that a valuable opportunity for the human race to come together was lost. Others think it is time to abandon allies that will not stand beside the US. Do you have an opinion on this? What did America fail to do? What did it do right?
I watched a fantastic (real) documentary called "In Search of 9/11" that asked questions like these. If you have the time, I highly recommend watching the program and answering these questions for yourself.