“American” is the standard English demonym for people and things from the United States, but it can just as easily be read as referring to all of North America, or to North and South America together. Most other languages use different words for American (the country) and American (the continents). If there were a campaign to replace the word “America” with something else when referring to the US, what would you think of it?
An argument I read on the Pleroma side of the Fediverse was that the USA is the only country in the Americas that matters.
That’s not an argument; it’s just xenophobia and exaggerated patriotism.