I'm just a little puzzled by this. I'm Canadian and I've been to the States hundreds of times, and every time people find out I'm Canadian, they act so ignorant around me. It really bothers me. I don't hate the US or anything, but I have to say that some Americans are excruciatingly ignorant (not that no Canadians are). I just feel like the US doesn't take into consideration that Canada is its number one source of imported oil and one of its biggest partners for tourism and trade. Americans are so quick to judge Canada, and most of them have never even BEEN there. I could name all fifty states, but if I asked an American to name five Canadian provinces, chances are they couldn't do it.

What really bothers me, though, is how some Americans seem to be PROUD of the fact that they know nothing outside the borders of their own country. And, I know I'm going to get hate for this, but most Americans I've talked to seem pretty dumb. I mean, the country that practically decides whether we go to war is the same country whose Congress tried to count pizza as a vegetable in 2011. Whenever there's a shooting, they're always "shocked and troubled," but what do you expect with such lax gun laws? What's the big deal with gay marriage? If two people love each other, let them get married. What's the big deal with abortion? It's the woman's choice, not the federal government's. Why is religion such a big deal? If you're religious, that's cool; if you aren't, that's cool too.
I'm sorry, but I just felt like I needed to vent my frustrations. I don't care if any of you Americans go apeshit in the comments, I really don't. It just proves that you can't handle someone talking down to America instead of the other way around for once. I just wonder if anyone agrees with me.
And, like I said, I don't hate the US. I just think the United States is run on hypocrisy and narcissism.