• 1 Post
  • 88 Comments
Joined 1 year ago
Cake day: June 29th, 2023

    1. I think you’re on the wrong community for this question.

    2. The thing regularly referred to as “AI” of late is more accurately called generative AI, or large language models. There’s no capacity for learning from humans; it’s pattern matching based on large sets of data that are boiled down to a series of vectors to predict the most likely next word in a response to a prompt (there’s a toy sketch of that idea at the end of this comment). You could argue that that’s what people do, but that would be a massive oversimplification. You’re right to say it does not have the ability to form thoughts and views. That said, like a broken clock, an LLM can put out words that match up with existing views pretty darn easily!

    You may be talking about general AI, which is something we’ve not seen yet and have no timeframe for. That may be able to have beliefs… but again, there’s not even a suggestion of it being close to happening. LLMs are (in my opinion) not even a good indicator of, or precursor to, that coming soon.

    TL;DR: An LLM (or generative AI) can’t have or form beliefs.
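    For the curious, here’s a minimal, purely illustrative sketch of the “most likely next word” idea mentioned above. It uses a made-up ten-word corpus and simple word-pair counts instead of learned vectors and billions of parameters, so it only shows the shape of the mechanism, not how a real LLM is built.

    ```python
    # Purely illustrative: a made-up 10-word "corpus" and pair counts stand in
    # for the learned vectors a real LLM uses; only the shape of the idea matches.
    import math
    from collections import defaultdict

    corpus = "the cat sat on the mat and the cat slept".split()

    # "Training": count which word tends to follow which.
    next_counts = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(corpus, corpus[1:]):
        next_counts[prev][nxt] += 1

    def next_word_distribution(prev):
        """Turn raw counts into probabilities (a softmax over the counts)."""
        counts = next_counts[prev]
        exps = {word: math.exp(c) for word, c in counts.items()}
        total = sum(exps.values())
        return {word: e / total for word, e in exps.items()}

    def most_likely_next(prev):
        """Pick the single most probable next word, as a greedy decoder would."""
        dist = next_word_distribution(prev)
        return max(dist, key=dist.get) if dist else None

    print(most_likely_next("the"))  # -> "cat": the pattern seen most often wins
    ```

    A real model does this with high-dimensional vectors and a long context window rather than one-word counts, but the output step is the same: a probability for every possible next token, then pick one. Nowhere in that loop is there anything you’d call a belief.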



  • jeeva@lemmy.world to okmatewanker@feddit.uk · bloody rigged · 3 months ago

    It’s clearly down to the increase in visible moustache-twirling villains in the news, saying stuff like “and I would have gotten away with it if it weren’t for those meddling kids.”

    Or something.

    Personally, I’m also annoyed at the increased use of “needs done”, which feels like it’s missing half a sentence. But hey, languages are big, and they can fit a lot of different usage.