Just wondering what exactly Americans (U.S.A.) are taught about their country in school. Do you really salute a flag? Do you swear allegiance to the country? Are you taught that you fought for freedom, and that America is THE land of the free, etc. etc.? Do you actually believe what you are taught?
Well, I am just one American, so I can't speak for everyone....seems kind of presumptuous to even try, wouldn't you say? It's been a while since I have been in school, but I'll cast my mind back and see....
Yes, I did salute the flag and say the Pledge of Allegiance. I didn't become a JW until I was 12, and most grades up to about 5th, I believe, salute the flag. I was also in school on a military base, as my father was in the Air Force, and I spent most of my life from birth to 12, when my dad retired, living on bases; ergo I would probably be more likely to be in a school where this was advocated. I wasn't brainwashed from the experience, obviously, as I later became a JW, swearing my allegiance to a theocracy. Now I keep my allegiance in reserve.
All I recall being taught regarding American history was about the Pilgrims and Paul Revere....sorry, I didn't pay a lot of attention in school. My basic understanding is that America is considered the "home of the free" because so many people fled here to escape oppression in their own countries...apparently some continue to do so... The name just stuck, I suppose...
Did I actually believe what I was taught? lol, do you believe what you were taught in school? I'm sure up to a point I did...but as you can see, it didn't make that big an impact on me.
I'm sorry it bothers you so much that some Americans love their country. Might want to spend some time pondering why that is, though...just a thought.
P.S. I'm sure the serfs in Merry Ole England will feel comforted to know they weren't considered slaves...I'm sure that took the sting off having a hand chopped off for poaching, or not being allowed to marry whom they wanted.