• @vzq
    link
    29
    edit-2
    8 months ago

    Do not mix code and input data.

    • glibg10b
      link
      fedilink
      168 months ago

      Right. I don’t know how the hell someone managed to reveal their OpenAI key to the LLM itself

      • @gaylord_fartmaster@lemmy.world
        link
        fedilink
        168 months ago

        I don’t think it gave him the openAI key, he just had the ability to send as many hijacked (not game related) prompts as he wanted through the game on the devs’ dime.

        • @computergeek125@lemmy.world
          link
          fedilink
          English
          38 months ago

          Which, now given the ability to inject arbitrary code, you could conceivably now write code to list every variable it had access to.

          • The text prompt in the game might also be vulnerable to arbitrary code injection, but that wouldn’t really have anything to do with the prompt injection being used here. Everything being done is within the confines of chatGPT which wouldn’t need or have access to any of the game’s code.

      • @DudeDudenson@lemmings.world
        link
        fedilink
        6
        edit-2
        8 months ago

        They didn’t. The point was that the guy could use their implementation freely as if he was paying for a chat gpt license. Basically he made the ai let him run any query he wanted trough it so he just has unlimited access to the paid version of chat gpt at the company’s expense