Just wait until AI starts rewriting history, changing historical facts, and purposely misinforming people.
It’s only a matter of time before AI will deny the Holocaust, the enslavement of Black people in the US, and the numerous African genocides.
Cough 1989 Tiananmen Square protests and massacre cough
Chinese AI bot says “what”?
Your computer just explodes, killing you instantly.
What a tragic accident…
That’s the Russian chatbot.
State media would report that a man using his computer suddenly fell to his death from the roof of his apartment. Neighbors say the man didn’t live in an apartment. 🤷‍♂️
In Russia, apartments fall on you
In Russia, windows fall out of you
The Google chatbot is already doing just that.
Not just that, remember it’s just an LLM. It predicts which tokens (or words and letters, if you will) come next. It doesn’t matter if it’s factual or fictional - if it sounds plausible enough, that’s what it produces.
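To put that mechanically, here’s a toy sketch - the tokens and probabilities are entirely made up, and real models sample from scores like these rather than consulting any kind of fact checker:

```php
<?php
// Toy illustration: the model only scores candidate next tokens and emits
// a likely one; "true vs. false" is not part of the objective.
// Tokens and probabilities below are completely invented.
$nextTokenProbs = [
    'fact'               => 0.41,
    'plausible_fiction'  => 0.33,
    'confident_nonsense' => 0.26, // still a candidate, just less likely
];

arsort($nextTokenProbs);                    // sort by score, highest first
$chosen = array_key_first($nextTokenProbs); // greedy pick, for simplicity
echo $chosen . PHP_EOL;                     // whichever token scored best
```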
LLMs are very confident in lying. I once asked it if there is a magic method to catch magic method calls in PHP - it told me it’s `__magic`. Lo and behold, there isn’t and never was such a method. That was my first and last time trying it, and there ain’t gonna be a second time in the near future.
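For the record, the closest real thing in PHP is the `__call()` magic method, which intercepts calls to undefined or inaccessible instance methods (`__callStatic()` does the same for static calls) - there is no magic method for catching calls to other magic methods, and `__magic` doesn’t exist. A minimal example:

```php
<?php
// __call() is invoked when an undefined or inaccessible instance method
// is called; $name is the method name, $arguments its arguments.
class Proxy
{
    public function __call(string $name, array $arguments): string
    {
        return "intercepted {$name}(" . implode(', ', $arguments) . ")";
    }
}

$p = new Proxy();
echo $p->doesNotExist('foo', 42); // intercepted doesNotExist(foo, 42)
```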
I asked Snapchat’s AI thing if it had internet access. Yes, it very confidently proclaimed. This was around when Tears of the Kingdom had been released on the Switch, so I asked what the latest Zelda game was and it named something much, much older. Of course that doesn’t necessarily mean anything, so I kept prodding for other recent news and was given nothing.
My partner asked the same of it and was told very confidently that no, it did in fact not have access to the internet, and it proceeded to give some long-winded explanation of why.
Can’t trust those things to say anything correct since they’re just doing what they’ve been constructed to do. String words together into sentences.
It can be useful in learning the basics of technology that’s completely unfamiliar to you.
It’s kinda like how it’s fine to use a Wikipedia article as a starting point for research on a subject. But it’s not a good idea to use Wikipedia as an authoritative source.
It’s also useful as a reminder for technology you haven’t used in a while. You can fairly quickly get the ordering of the parameters of a well-known method and be on your way a lot faster than with a Google search or going back through your code to find where you’ve done it before.
It’s also good for mundane tasks like making a class that’ll handle a specific JSON request or handle some data coming from a table in the DB. You know, the things where you’re just copying and pasting some property names and entering the corresponding types. Just put in 'class for: {"blah": 69}' or whatever and save a little time.
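For that last example, the kind of output you’d be hoping for looks roughly like this (a sketch in PHP 8; the class name and the `fromJson` helper are invented for illustration, and the property type is just inferred from the sample value):

```php
<?php
// Hypothetical DTO an assistant might produce for: class for: {"blah": 69}
class BlahRequest
{
    public function __construct(
        public int $blah, // type guessed from the sample value 69
    ) {}

    public static function fromJson(string $json): self
    {
        $data = json_decode($json, true, 512, JSON_THROW_ON_ERROR);
        return new self(blah: (int) $data['blah']);
    }
}

$req = BlahRequest::fromJson('{"blah": 69}');
echo $req->blah; // 69
```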
But yeah, it’s not going to know anything too technical, it’s not going to know anything about less common problems, it’s not going to know the best algorithm to use, etc. But if you’re just using it for some basic-ass shit, it works well enough.
Wikipedia is usually sourced when it comes to scientific stuff, so it’s a reliable source of information (usually).
But yeah, for simple and mundane stuff that you can check and instantly know whether it’s good or not, I totally understand. But for more complex or unfamiliar advanced stuff, not so much.
Yeah, combined with the possibility that it will become the next generation’s go-to source for easy info, we are screwed.