Racism

White people think racism is getting worse. Against white people.

The Washington Post: How do Americans think about the role of race in our country’s daily life? News reports, social media and uncomfortable dinner conversations often point to one conclusion: They disagree. Many white Americans believe that the United States has entered a post-racial phase; many black Americans believe that …