I’m aware that, at the moment, Florida is deemed unsafe to travel to, but what is generally the worst state to live in? Factors such as education, religious extremism, crime, cost of food, healthcare, and access to other resources are relevant. Will “Deep South” states remain among the worst places to live due to traditionalism, or are more progressive states bound to grow worse?
I know, I understand it’s weird. I also made fun of it when I first saw a warning sign in a Starbucks that coffee may contain a cancer-causing chemical, or the signs in parking garages and carports of apartment complexes warning that car emissions in enclosed spaces can cause cancer.
However, there are people who work in certain professions with abnormally high exposure to certain risks who would otherwise not know. In this example, a car attendant or valet might appreciate the warning about working in a parking garage.
I recently learned that workers who carve and drill engineered quartz countertops are developing silicosis and dying as young as 40. I’m sure they wish California had given them one of those silly warnings. Same with the Radium Girls. I’m all for erring on the side of caution lest someone find out too late.
Too many warnings are as good as no warnings. How do you know when it’s trivial exposure vs. serious exposure?
Prop 65 warnings, while well intentioned, turned into a way for shitty lawyers to make money suing everybody. It doesn’t do anyone any good because people will just put that generic warning on everything to CYA.