I know the USA is not perfect. There have always been social issues and divisions. Politics has been divisive for many people. The news media can be very biased. Some people want socialism, and others abhor it.
Even if you don't live in the United States, what's your take on this country? Would you want to live here? If you do live here, do you ever think of leaving? Is it as good or as bad as some people say?