People have been scared of new technology for as long as there has been technology, but I’ve never fallen into that camp until now. I have to admit, this really does frighten me.
What’s wild to me is how Yann LeCun doesn’t seem to see this as an issue at all. Many other leading researchers (Yoshua Bengio, Geoffrey Hinton, Frank Hutter, etc.) signed that letter on the threats of AI, while LeCun just posts on Twitter about how we’ll simply “not build” potentially harmful AI. It really makes me lose trust in anything else he says.
To make that statement a little more accurate, I’m afraid of the humans who will abuse this technology, and of whether society can adapt to it. There are some amazingly cool things that could come from this: all the small indie creators who lack the connections and project-management skills to bring their ambitions to life will be able to achieve their vision. That’s really cool and I’m excited for it, but my excitement is crushed by knowing all the bad that will come along with it.