
Is the United States of America an empire? If so, where is the empire, and how has it changed since 1900? If not, was it ever? If it was but is no longer an empire, when did it end, and what led to its demise? Please provide historical evidence.

Answer:

Because the United States does not seek to control territory outright or to govern overseas populations directly, it is an indirect empire, to be sure, but an empire nonetheless.