• Wispy2891@lemmy.world · 15 hours ago

    I have no experience with this IDE, but I see in the log posted on Reddit that the LLM is talking about a “step 620” - so this is hundreds of queries away from the initial one? The context must have been massive; after that many follow-up queries they usually start hallucinating badly.

    • Wispy2891@lemmy.world · 27 minutes ago

      Let me explain what I mean: those models have no memory at all. Each request starts from a blank slate, so when you have a “conversation” with them, the chat program is actually resending all the previous interactions (or a summary of them) plus all the relevant parts of the code, simulating a conversation with a human. So the user didn’t just ask “can you clear the cache” - what actually got sent was 600 messages’ worth of history + kilobytes of generated code + “can you clear the cache”, and that is what causes destructive hallucinations.
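
      Roughly, the loop looks like the sketch below. This is just an illustration, not this IDE’s actual code: call_model is a hypothetical stand-in for whatever completion endpoint it uses, and the only point is that the entire history gets re-sent on every turn.

      ```python
      # Minimal sketch of a "stateful" chat built on top of a stateless model.
      # call_model() is a hypothetical placeholder, not a real API; it just
      # reports how big the prompt has grown so the context growth is visible.

      history = []  # every user message, model reply, and tool result so far

      def call_model(messages):
          # Placeholder for the real LLM request.
          return f"(reply to a prompt containing {len(messages)} messages)"

      def ask(user_message, code_context):
          history.append({"role": "user", "content": user_message})
          # The prompt for THIS request is the whole conversation so far
          # plus the relevant code, not just the latest question.
          prompt = [{"role": "system", "content": code_context}] + history
          reply = call_model(prompt)
          history.append({"role": "assistant", "content": reply})
          return reply

      # By "step 620" the prompt already holds ~600 prior messages plus
      # kilobytes of generated code before "can you clear the cache" is added.
      ```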