American imperialism
American imperialism consists of policies aimed at extending the political, economic, media, and cultural influence of the United States over areas beyond its boundaries. Depending on the commentator, it may include military conquest, gunboat diplomacy, unequal treaties, subsidization of preferred factions, economic penetration through private companies followed by diplomatic or military intervention when those companies' interests are threatened, or regime change.[1][2]
American imperialism is usually considered to have begun in the late 19th century,[3] though some consider US territorial expansion at the expense of Native Americans similar enough to deserve the same label.[4] The United States has never officially referred to its territorial possessions as an empire, but some commentators nonetheless describe it as one, including Max Boot, Arthur Schlesinger, and Niall Ferguson.[5] The United States has also been accused of neocolonialism, sometimes defined as a modern form of hegemony that uses economic rather than military power to maintain an informal empire; the term is sometimes used as a synonym for contemporary imperialism.
The question of whether the United States should intervene in the affairs of foreign countries has been debated in domestic politics throughout the country's history. Opponents have pointed to the country's origin as a former colony that rebelled against an overseas king, and to American values of democracy, freedom, and independence. Supporters of the presidents labeled as imperialist, including Andrew Jackson, James K. Polk, William McKinley, Theodore Roosevelt, and William Howard Taft, justified intervention in or seizure of various countries by citing the need to advance American economic interests (such as trade and repayment of debts), the prevention of European intervention in the Americas (known as the Monroe Doctrine), and the benefits of keeping good order around the world.