Emily Hanley says she and other out-of-work copywriters are only the first wave of AI collateral and calls the collapse of her profession the "tip of the AI iceberg."
Bald-faced appeal to authority, okay. With a side of putting words in my mouth that I clearly did not say.
The industrial revolution destroyed some jobs and created others; it destroyed some industries and created others. We’ve been in an “information revolution” for some time, in which electronic computers have supplanted human computers and opened up an enormous realm of communication, discovery, and availability of information to more people than ever before in history. This is simply true.
Just as the landscape of human physical labor was forever changed by the industrial revolution, the landscape of human thinking labor will continue to be forever changed by this information revolution. AI is a potential accelerator of that revolution, and we are already seeing its impacts even at this extremely early stage of AI development. There will be both good and bad outcomes.
Appealing to authority is useful. We all do it every day. And like I said, all it should do is make you question whether you’ve really thought about it enough.
Every single thing you’re saying has no bearing on how AI will turn out. None.
If a 0 is “we figured it out” and a 1 is “we go extinct”, here is what all possible histories look like in terms of “how things that could have made us go extinct actually turned out”:
1
01
001
0001
00001
000001
0000001
00000001
etc.
You are looking at 00000000 and assuming there can’t be a 1 next, because of how many zeroes there have been. But every extinction event will be preceded by a bunch of non-extinction events.
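If it helps to see the point concretely, here is a minimal simulation sketch. It assumes, purely for illustration, a fixed independent per-era extinction probability p (the value 0.05 and every name in it are my own, not anything established in this thread): every simulated history that ends in a 1 is preceded by nothing but 0s, and having already observed eight 0s does nothing to shrink the chance that the next digit is a 1.

```python
import random

random.seed(0)
p = 0.05  # illustrative per-era extinction probability (an assumption, not a claim)

def history():
    """One possible history: 0s (survived the era) until the first 1 (extinction)."""
    h = []
    while True:
        if random.random() < p:
            h.append(1)
            return h
        h.append(0)

runs = [history() for _ in range(100_000)]

# Every history that ends in extinction is preceded only by non-extinction events.
assert all(h[:-1] == [0] * (len(h) - 1) for h in runs)

# Among histories that already show 00000000 (survived 8 eras), the chance the
# next era is a 1 is still roughly p -- the run of zeroes was no evidence at all.
survived_8 = [h for h in runs if len(h) > 8]
frac = sum(len(h) == 9 for h in survived_8) / len(survived_8)
print(f"P(1 on era 9 | eras 1-8 were 0) ~ {frac:.3f} (p = {p})")
```

Under this (admittedly toy) assumption the distribution is memoryless: the string of past zeroes carries no information about the next digit.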
But again, it is strange that you can label an appeal to authority, but not realize how much worse an “appeal to the past” is.
You don’t seem to have actually read anything I’ve written, and just want to argue with someone.
Nope. I certainly have. They’re the same arguments I’ve been hearing from people dismissing AI alignment concerns for 10 years. There’s nothing new there, and it all maps onto exactly the wishful thinking I’m talking about.
Based on the fact that I have not anywhere “[dismissed] AI alignment concerns,” I stand by the above statement.
You understand that the fallacy is the appeal to false authority, right? Not just any authority?
Swinging the partial names of logical fallacies around like a poorly wielded shield isn’t actually an argument. It’s just an attempt to poison the well.