When was the last time France, Spain, England, Finland, Canada, or pretty much any other decent country was "an Empire"?
They seem to be pretty free and happy places to live.
Why can't America be a free and happy place to live without worrying so much about being "an Empire?"
Being "an Empire" seems entirely overrated, I don't see why anyone would even want to be in one.
quote:
Empire
Definition: an extensive group of states or countries under a single supreme authority, formerly especially an emperor or empress.
Going by the strict definition, the USA has never been an empire. In fact, it was founded through a revolution against an existing empire, namely the British Empire.
The USA is a powerful country, and we'd all be lying if we said it didn't flex its muscles from time to time to get what it wants. But that is a far cry from conquering and subjugating other states/countries. Not to mention we have a democracy, which means there isn't any Emperor or Empress running things. Although Trump may disagree.
Incidentally, even when empires 'fall', that doesn't necessarily mean the nations behind them wink out of existence. The Roman Empire fell, yet Italy still exists. The Macedonian Empire under Alexander the Great fell, yet Greece still exists. Britain, Spain, France, Turkey, Mongolia: all had empires, yet all the core countries are still intact and, for the most part, still prosperous.