Bo Jacobs · 2 months ago
White Americans are realizing the United States is everything Black Americans have been telling them it is for centuries.