Comment by ainiriand

16 hours ago

When you say America, surely you mean the USA? Or is America a country now?

Technically, America is neither a country nor a continent. But the USA is the only country on either continent of the Americas with the word "America" in its official name. Give the Americans a break.