• Nightwatch Admin@feddit.nl · 19 points · 8 days ago

    Hahaha. April 1st is early this year.
    They are never going to make enough money selling licenses and subscriptions to cover the cost of their current models (smarter people than me have made good estimates), let alone future ones, which have a much worse performance-to-cost ratio. Ads will at best bring in about $1 per user per month (estimated from Facebook’s revenue and user count) - double or triple that just for lolz, and they would still be losing money.
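    A rough back-of-the-envelope of that ad figure (the inputs below are illustrative placeholders, not Facebook’s actual reported numbers):

    ```python
    # Ad revenue per user per month = annual ad revenue / monthly active users / 12.
    # Both inputs are hypothetical placeholders chosen only to show the arithmetic.
    annual_ad_revenue_usd = 40e9  # hypothetical annual ad revenue
    monthly_active_users = 3e9    # hypothetical user count

    arpu_per_month = annual_ad_revenue_usd / monthly_active_users / 12
    print(f"~${arpu_per_month:.2f} per user per month")  # ~$1.11 with these placeholders
    ```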
    So… how will this be pulled off? Only wrong answers!

    • futatorius@lemm.ee · 3 points · 7 days ago

      They’ll upgrade the Aibo and stick Altman’s face on it. People in offices can enjoy kicking it.

    • Nikelui@lemmy.world · 15 points · 8 days ago

      Have a partnership with Microsoft and ship Windows 12 as the new “AI only” OS, where every command must go through ChatGPT to work. Then push updates that make older Win11 installs unusable.

        • themurphy@lemmy.ml · 4 points · 8 days ago

          They don’t care if they earn money in the next 5-7 years.

          And they will hit the point where a capable model does human work for less than a monthly salary. It’s just a matter of time.

          • fine_sandy_bottom@lemmy.federate.cc · 9 points · 8 days ago

            I’m incredulous.

            There was that thread asking what people are using LLMs for and it pretty much came down to “softening language in emails”.

            For most jobs LLMs can provide a small productivity bump.

            IMO if an LLM can do most of your job then you’re not producing much value anyway.

          • curbstickle@lemmy.dbzer0.com · 5 points · 8 days ago

            Without enough funding, they absolutely will care.

            That’s between $33 billion and $47 billion at current costs. Someone needs to fund that.

            I’d also note that their models seem to be getting worse, with outright irrelevant answers, worse performance, failures to follow instructions, etc. Stanford and UC Berkeley did a months-long comparison, and even basic math is going downhill.

          • purrtastic@lemmy.nz · 4 points · 8 days ago

            LLMs are not advancing much anymore. There just isn’t any more useful human-generated text to train new models on; the net is already full of AI-generated slop. OpenAI currently spends $2.35 to make $1. It’s fundamentally unsustainable.
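            To spell out what that ratio implies, a minimal sketch taking the $2.35-per-$1 figure at face value (the revenue number below is a made-up placeholder):

            ```python
            # If every $1 of revenue costs $2.35 to produce, losses grow in proportion to revenue.
            cost_per_revenue_dollar = 2.35   # ratio quoted above, taken at face value
            revenue = 1_000_000_000          # hypothetical $1B of revenue

            costs = revenue * cost_per_revenue_dollar
            loss = costs - revenue
            print(f"${revenue/1e9:.0f}B revenue -> ${costs/1e9:.2f}B costs, ${loss/1e9:.2f}B loss")
            # Scaling revenue alone doesn't fix this; the ratio itself has to drop below 1.0.
            ```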

            • themurphy@lemmy.ml · 3 points · 8 days ago

              It cost a billion dollars to develop solar cells before the first product was even sold.

              They cost $100,000 when they first started selling.

              They go for under 10 bucks per square today.

              And it’s like that for every technology ever invented.

              • futatorius@lemm.ee · 1 point · 7 days ago

                It’s also like that for nearly every technology that has failed. For every Amazon that ran in the red until it grabbed enough market share to make a profit, there are 1000 firms that went tits-up, never having turned a profit. (Actual constant may vary from 1000, but it’s pretty damn big regardless).

              • Nightwatch Admin@feddit.nl · 2 points · 7 days ago

                Yes, but solar cells are, in the end, very simple products made from very simple resources, with one limited task: converting one type of energy into another. That said, there is still research into making them more efficient and cheaper, and that research isn’t cheap.
                Generative AI / LLMs, on the other hand, take an insane amount of resources to train and maintain, are complex to create, perform a very complex task, and a slight increase in quality takes progressively more resources (say, 10% better costing 50% more energy - I don’t have the numbers anymore, but IIRC they were even worse). A better LLM would therefore be much, much more expensive, while people are apparently already underwhelmed with the latest models. With growing competition, fast-rising costs and meagre quality improvements, while they are already unable to sustain themselves financially right now, I truly don’t see it. Honestly, this is why I think Microsoft is cramming its subpar Copilot into everything: to sort of justify all the money they pumped into this.
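                To illustrate the kind of curve being described, a minimal sketch assuming quality follows a power law in compute (the exponent is a pure placeholder, not a measured value for any real model):

                ```python
                # If loss ~ compute**(-alpha), small loss reductions need outsized compute.
                # alpha below is an illustrative placeholder, NOT a measured scaling exponent.
                def compute_multiplier(loss_reduction: float, alpha: float = 0.05) -> float:
                    """Factor by which compute must grow to cut loss by `loss_reduction` (0.10 = 10%)."""
                    return (1 - loss_reduction) ** (-1 / alpha)

                for r in (0.05, 0.10, 0.20):
                    print(f"{r:.0%} lower loss -> ~{compute_multiplier(r):.0f}x the compute")
                ```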