When Florida’s lawyers tried to defend the state’s social media age restriction law by claiming it’s “well known” that platforms harm children, they probably weren’t…
On one hand, the judge is right. On the other hand, the lawyer is right. Then, on two more hands, they’re both wrong.
Yes, it’s bad to legislate by moral panic. Yes, kids are addicted to social media. Those are both facts.
The reason age gating is a bad idea isn’t moral panic, or “the children”. It’s because we’re ALL addicted to social media. It isn’t just the kids; it’s adults as well. The problem is the intentionally addictive algorithms, meticulously engineered to keep us scrolling. I’m telling you, in 50 years we’ll know how all the social media companies were hiding and lying about the addictive, harmful nature of their business, just like we know about tobacco and oil companies today.
The best solution I can think of is to revisit Section 230. You can’t hold these companies responsible for what people post to their sites, but we can and must hold them accountable for what they recommend! If you have a simple, easily definable sorting or ranking system over what people choose to follow? You’re fine, no accountability for something bad showing up. If you have some black-box algorithm of infinite scrolling, based on criteria so complex that nobody can really break down and explain exactly why a specific post was shown to a specific individual? Now you’re on the hook for what they see.
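To make that distinction concrete, here’s a minimal, purely hypothetical sketch. The field names, accounts, and scores are all invented for illustration and don’t reflect any platform’s actual system: one function is a follow-based chronological feed where every placement can be explained in a sentence, the other is an engagement-driven ranking standing in for the black-box case.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Set

@dataclass
class Post:
    author: str
    timestamp: datetime
    engagement_score: float  # stand-in for an opaque signal (clicks, dwell time, etc.)
    text: str

def chronological_feed(posts: List[Post], following: Set[str]) -> List[Post]:
    """The 'simple, easily definable' case: only accounts the user chose to
    follow, newest first. Every placement is fully explainable."""
    followed = [p for p in posts if p.author in following]
    return sorted(followed, key=lambda p: p.timestamp, reverse=True)

def engagement_feed(posts: List[Post]) -> List[Post]:
    """Stand-in for the 'black box' case: ranking driven by an opaque
    engagement score, regardless of who the user follows. In reality this
    would be a learned model nobody can explain post-by-post."""
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

if __name__ == "__main__":
    posts = [
        Post("alice", datetime(2024, 6, 1, 9, 0), 0.2, "morning update"),
        Post("bob",   datetime(2024, 6, 1, 10, 0), 0.9, "outrage bait"),
        Post("carol", datetime(2024, 6, 1, 8, 0), 0.5, "cat photo"),
    ]
    following = {"alice", "carol"}

    print([p.author for p in chronological_feed(posts, following)])  # ['alice', 'carol']
    print([p.author for p in engagement_feed(posts)])                # ['bob', 'carol', 'alice']
```

The point of the toy example: the first feed only ever shows you what you asked for, in an order anyone can verify; the second can surface “outrage bait” from someone you never followed, for reasons buried in a score.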