Definition, synonyms and related words
Describing a tendency to view the world from an American perspective, with the assumption that the United States is superior to other countries.