“American” is the standard English demonym for people and things from the United States, but the word can just as easily be read as referring to all of North America, or to both North and South America. Most other languages have separate words for American (the country) and American (the continents). If there were a campaign to replace the word “America” with something else when referring to the US, what would you think of it?