America is the self-proclaimed “land of the free and home of the brave,” but it has NEVER quite lived up to the title. From the time of its so-called “discovery,” this country has been plagued by greed, infested with corrupt politics, and complicit in the dehumanization of all of its inhabitants except white people.
The history of America is a brutal one, filled with blood and gore and pain and suffering. The infrastructure of this country was built on the backs of African slaves. America’s original sin was the genocide of the native people who walked this land long before its “discovery.” Once knowledge of this “free” land reached the Old World, it incited wave after wave of voyages by colonizers hailing from England, France, Germany, and of course Spain.
[So, sidebar here: it always disgusts me when racists say things like “go back to your country,” “speak English,” “we don’t want you here.” America is literally a STOLEN land with STOLEN BELIEFS, STOLEN LANGUAGES, and STOLEN CULTURE.]
The colonizers soon realized they had neither the brains nor the brawn to till soil or build effectively, so they instead chose to be LAZY: they stole hundreds of thousands of human bodies and trafficked them around the world.