I’m aware that, at the moment, Florida is deemed unsafe for people to travel to, but what is generally the worst state to live in? Factors such as education, religious extremism, crime, cost of food, healthcare, and access to other resources are relevant. Will the “Deep South” states remain among the worst places to live due to traditionalism, or are more progressive states bound to grow worse?
Incorrect use of the word “literally,” and I’m not racist. I told you I’m just xenophobic (against Southerners and the South).
Whoever upvoted you is literally at least as stupid as you are, and it’s fucking sad.
Yes, yes… only a simple-minded, stupid dumbfuck… so much better.
56% of black people in America live in the South, so when you people constantly crap on the South, I’m more than certain, almost POSITIVE, that you’re talking about black counties and towns. You’re just trying to be sneaky about not saying the quiet part out loud and avoiding it at all costs.
I’m also willing to bet you think majority-white places are the best places to live, too, as that’s usually the other part of the equation. I’ve been around long enough to know what white flight is.
It seems like you expect the commenters to be white.