“American” is the standard English word for people or things from the United States, but it can just as easily be read as referring to all of North America, or to both North and South America. Most other languages have separate words for American (the country) and American (the continents). If there were a campaign to replace the word “America” with something else when referring to the US, what would you think of it?

  • Ravn
    3 years ago

    That’s what I use. Didn’t know it was a leftist thing, though.

    • @AgreeableLandscape@lemmy.ml (OP)
      3 years ago

      I’m pretty sure it isn’t, but a lot of leftists do use it to exclude the rest of the Americas when talking about how the US screws up.