There are also more example videos in the technical report.

Personal take: if they didn’t say how the videos on the page were created, I genuinely think several of the AI-generated videos could be passed off as being shot with a camera or made with CGI (though there are probably still inconsistencies if you look hard enough).

This failure example is quite amusing.

    • Sonori@beehaw.org

      I like the bright warning signs, they make a nice whooshing sound as we speed past.

  • RIPandTERROR

    All human discovery will be fed to the orphan crushing machine. It’s a law of science.

  • darkphotonstudio@beehaw.org

    This is a serious question, and I’d love to hear some other views on it: should there be laws that assess new tech before it is allowed into the public sphere? How would such things be enforced?

    • maxsettings@lemmy.ca

      I don’t think this would work, since most governments don’t understand technology well (just look at the Flipper Zero ban in Canada as an example). Technology has also been disruptive to existing industries (Uber, Airbnb, Netflix, etc.). I think traditional industries would just end up lobbying governments whenever they were challenged by new technology companies, and we’d see less technology overall. That being said, I can see the need for more tech regulation in a lot of areas (looking at you, Apple); I just can’t see a blanket solution being the right approach.

    • jarfil@beehaw.org

      The Amish already have a society like that; you can check out how it works.

  • lichtmetzger@feddit.de

    It’s quite frightening to see how fast these AI models have improved over the last few years. You can still spot errors in the videos, but how long will it be until you can’t anymore?

    It sounds terrifying to no longer know what’s real and what isn’t. These videos will also put a lot of people out of jobs, especially in the creative industry. Who needs someone following a car with a drone anymore, when you can just generate that footage on the fly?