Myths of Manifest Destiny

We were taught a lot of misleading and straight-up incorrect ideas about how America came to be. Manifest Destiny is the idea that the United States was destined by God to expand into the vast "empty" land out West, justified by beliefs that pioneers could use the land more productively than Native Americans could. Capitalism, Religion …