Slavery in America
Thinking about white guilt and the varying degrees of it in different regions around America, I decided to give myself a history lesson.
I’m as southern as they come, which means lots of different things depending on who you ask. But whatever else it means, to most everyone it means carrying around some guilt about slavery from birth, whether you had anything to do with it or not. I’ve been taught to hang my head in shame for the transgressions of…who, exactly?
I wondered: where did the American slaves come from? Surely the American and European owners weren’t in Africa chasing people down in the streets with butterfly nets. There were sources that supplied them, and those sources were African.
Slaves who came to America were acquired in one of five ways, the most common being prisoners of tribal war captured by another African “nation.” So Africans were selling other Africans. Incidentally, this practice still exists in parts of Northern Africa, as do many other illegal activities. Another “use” of slaves, however, was human sacrifice, so take your pick. We know which one New Hampshire would choose.
Considering that their status as slaves began in Africa, and that it was ultimately Americans who freed them (in America, at least), why hold such hostility toward white America for so long? Why isn’t the ire directed toward those who enslaved blacks in the first place? And where did that attitude originate? Universities? Blacks also owned slaves, and many freed slaves returned to the work they had done before, not having many other prospects, but they were better off than they would have been in Africa as prisoners or sacrifices.
Any theories, or historians with a good answer?