and ask a question. The Western world has been vilified over the last 30-40 years for colonizing what it regarded as the uncivilized world. Americans have been vilified for taming the West and teaching so-called savages civilized ways. Do you feel that today's stories of barbarians and uncivilized beasts vindicate much of the expansionism that went on during the 18th and 19th centuries?