Slavery... Racism... Corporate greed... Atom bombs used against civilians... 53 million aborted... A debt to be paid by generations yet to be born... An outbreak of child offenders in public schools...
Does anyone think America is evil? Or does she simply have evil elements that tarnish her reputation as a great nation?
**Because I have been accused of all kinds of things, let me tell you my full agenda here: I am Roman Catholic. The Church has been accused of everything but the common cold (but how can we truly know they didn't create it!!), and I am trying to get people to come around to the truth: that there is evil in all elements of society, religion, organizations, nations, etc., just as there is good in all of them. I am of the opinion that America is not evil, but has elements within her that undermine her, just as the Catholic Church (and most religions for that matter, including the JWs) struggles with the same thing.**
I would like to avoid criticizing each other's religions and non-religions. I am not hoping to convert anybody. I only want us to discuss America and our opinions about where she has been and where she is going... Thoughts?