It seems to me that many Americans are convinced that Europeans are forever slagging off the USA for one reason or another.
Just to set the record straight, America-bashing is relatively rare in Europe; most of us regard the USA with a not inconsiderable degree of affection. Obviously there are cultural differences between the continents that can lead to feelings being trampled on, but people who genuinely hate America are quite rare over here. The few real antagonists are very vociferous, but, like most fundamentalists, they're regarded as completely loony.
Perhaps any Americans who have visited over here would care to share an opinion?
Englishman.