I just thought I'd shoot out a few things on my mind, as a Black male living in America. I see a bunch of stuff out there every day that makes me realize how racist our society still is. I'm fully aware that it's been worse in the past, and it's worse in other areas of the world, but I don't feel like what we have today should be ignored. I love bantering about race-related issues in America (especially from a Black man's perspective, since...that's what I am), and I figured I'd finally do something about it in a therapeutic manner. My stuff will probably be slanted towards Black-White relations, but this isn't meant to be a "kill Whitey" deal. I'm fully aware that racism exists against every race, including Whites, but for the most part, I can only speak from my own perspective.
With that being said, it's time for me to point out why...America is Racist!