As far as I'm concerned, the United States is in North America, just like Canada and Mexico. So why do Americans think "America" means only the US? So often I hear Americans refer to the US as "America".
Canadians don't refer to Canada as "America", yet we are just as much a part of America as the US is.
As a Canadian, I feel the US thinks it is the center of the world, or at least of North America.