History of American Imperialism
Imperialism is a policy of extending a country's power and influence through diplomacy or military force. Manifest Destiny was the 19th-century belief that American expansion was both justified and inevitable. There is no doubt that the two went hand in hand. American imperialism brought both positive and negative effects […]