- cross-posted to:
- hackernews@lemmy.smeargle.fans
Google rolled out AI overviews across the United States this month, exposing its flagship product to the hallucinations of large language models.
People anthropomorphise LLMs way too much. I get that at first glance they sound like a living being, human even, and it's exciting, but we've had enough time by now to know it's just a very cool big-data processing algorithm.
It's like boomers asking me what the computer is doing and referring to the computer as a person. It makes me wonder: will I be as confused as them when I'm old?
Oh, hi, second coming of Edsger Dijkstra.
He may think like that when using language like that. You might think like that. The bulk of programmers doesn't. Also, I strongly object to the dissing of operational semantics. Really dig that handwriting, though; a well-rounded lecturer's hand.
Don't say those things to me. I have special-snowflake disorder. I literally got high reading this, seeing that a famous intelligent person holds the same opinion as me. Great minds… god, see what you have done.
It’s only going to get worse now that ChatGPT has a realistic-sounding voice with simulated emotions.
Probably not about computers per se. The Greatest Generation knew a lot more about horses than the average person today, and similarly we know more about the things that have mattered to us over the course of our lifetimes.
What would get weird for us is if, when we're at retirement age (of course we can never actually retire, because capitalism), someone talks about the new horglesplort based on alien vibrations computer-generated from the 11th dimension of string theory, and we're all like "wut!?"
fr fr no cap skibidi toilet rizz teabag
That said, humanity seems not only to have slowed the accretion of new knowledge but to have actually gone backwards: children today won't live as long as boomers did, and despite being on mobile devices all day long, most don't have the foggiest clue how computing works, as in programming or even binary. So we will likely be confused in the opposite way: "why can't you understand this?"