• BitSound@lemmy.world
    1 year ago

    For your edit: Yes, that’s what’s known as the context window limit. ChatGPT has an 8k-token “memory” (for most people), and older entries are dropped once the conversation exceeds it. That’s not an inherent limitation of the approach; it’s just a way of keeping OpenAI’s bills lower.
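    The drop-the-oldest behavior can be sketched in a few lines. This is a hypothetical illustration, not OpenAI’s actual code: the `trim_to_window` name and the word-count token estimate are stand-ins (real systems use a proper tokenizer, e.g. tiktoken).

```python
# Minimal sketch (hypothetical): keep only the most recent messages that
# fit in a fixed context window, dropping the oldest entries first.

def trim_to_window(messages, max_tokens=8192,
                   count_tokens=lambda m: len(m.split())):
    """Drop the oldest messages until the total token estimate fits."""
    kept = list(messages)
    while kept and sum(count_tokens(m) for m in kept) > max_tokens:
        kept.pop(0)  # the oldest entry falls out of the "memory"
    return kept

# Each fake message is ~1001 "tokens"; 12 of them overflow an 8k window.
history = [f"message {i} " + "word " * 999 for i in range(12)]
window = trim_to_window(history, max_tokens=8192)
print(len(window))  # only the most recent messages survive
```

    With a real tokenizer the arithmetic changes, but the effect is the same: whatever scrolls past the window boundary is simply gone from the model’s view.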

    Without an example I don’t think there’s anything to discuss. Here’s one trivial example, though, where I altered ChatGPT’s understanding of the world:

    If I continued that conversation, ChatGPT would eventually forget that fact due to the aforementioned context window limit. For a more substantive way of altering an LLM’s understanding of the world, look at how OpenAI used RLHF to get ChatGPT to stop saying naughty things. That permanently altered the way GPT-4 responds, much like having an angry nun rap your knuckles whenever you say something naughty.
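    The knuckle-rap idea can be shown with a toy policy-gradient loop. Everything here is hypothetical (a two-token “policy”, a hard-coded stand-in for the reward model, REINFORCE instead of the PPO that OpenAI actually used), but it shows the mechanism: a reward signal pushes the weights themselves, so the change persists with no context window involved.

```python
import math
import random

random.seed(0)
logits = {"polite": 0.0, "naughty": 0.0}  # tiny two-token "policy"

def probs(lg):
    """Softmax over the logits."""
    z = sum(math.exp(v) for v in lg.values())
    return {k: math.exp(v) / z for k, v in lg.items()}

def reward(tok):
    """Stand-in for a learned reward model: punish the naughty token."""
    return -1.0 if tok == "naughty" else 1.0

lr = 0.5
for _ in range(200):
    p = probs(logits)
    tok = random.choices(list(p), weights=p.values())[0]
    r = reward(tok)
    # REINFORCE update: d log p(tok) / d logit_k = 1{k == tok} - p[k]
    for k in logits:
        grad = (1.0 if k == tok else 0.0) - p[k]
        logits[k] += lr * r * grad

print(probs(logits)["naughty"])  # driven toward ~0 after training
```

    After training, the low probability of “naughty” is baked into the logits, which is why RLHF-style changes survive across conversations while in-context corrections do not.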