Comment by ainiriand

2 days ago

When you say America, you surely mean the USA? Or is America a country now?

Technically, America is neither a country nor a continent. But the USA is the only country on either of the continents of the Americas to have the word "America" in its official name. Give the Americans a break.

I think this is a cultural difference around the world. In my country, people call it America. USA sounds a bit wanky, or whatever Americans themselves would call it. It doesn't matter because everyone knows what you mean.