Setting aside a global (country-by-country) comparative analysis, if you have a valid basis and rationale, I'd love to hear it. Not just for the white majority, but for ALL. Outside of today in 2016 (which to me is the greatest and most inclusive America has ever been), when was America truly great in all aspects, a place that was fair and balanced for ALL peoples? For you old heads, break it down by decade if you have to, but answer honestly on the merits, backed by logic and by hard, cold FACTS.
Thanks
Originally Posted by Sistine Chapel
Defeat of the Germans and the Japanese in WWII.
Things in America have never been better for blacks than they are now, but so many complain so much that I don't know when you people will ever be happy with America... speaking of which:
What does America have to do to stop being accused of being racist by black people?
America was never perfect for everyone, but it sucks right now. It has never been so divided, never had such a poisonous environment, never seemed so ungovernable. It used to be freer.
I see rioting and civil unrest just getting worse until the whole thing falls apart.