• Feyd@programming.dev
    17 hours ago

    You also have to run the model with the input to determine what the output will be; there is no way to determine it BEFORE running. With a deterministic system, if you know the code you can predict the output with 100% accuracy without ever running it.

    This is not the definition of determinism. You are adding qualifications.

    I did look it up, and I see now that there are other factors outside your control if you’re using a remote system. So I’ll amend my statement: you can have deterministic inference systems, but the big ones most people use cannot be configured by the user to be deterministic.
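
    To make that concrete, here is a minimal sketch of what “not under your control” looks like from the client side, assuming the current openai Python package (the model name is just a placeholder). Even with temperature pinned to 0 and a fixed seed, the provider only promises best-effort reproducibility, because batching, hardware, and kernel scheduling on their end are not something you can configure:

    ```python
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def ask(prompt: str) -> str:
        # temperature=0 removes sampling randomness on our side, and seed asks
        # the provider for best-effort reproducibility, but neither pins down
        # batching, hardware, or reduction order on the remote end.
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{"role": "user", "content": prompt}],
            temperature=0,
            seed=1234,
        )
        return resp.choices[0].message.content

    # Two identical calls usually match, but it is not guaranteed; the
    # system_fingerprint field in the response exists so you can at least
    # detect when the backend configuration changed between calls.
    print(ask("Name one prime number.") == ask("Name one prime number."))
    ```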

    • GreenBeanMachine@lemmy.world
      17 hours ago

      Deterministic systems are always predictable, even if you have never run the system. Can you determine the output of an LLM with zero temperature without ever having run it?

      And even disregarding the above, no, they are still NOT deterministic systems, and can still give different results, however unlikely. The variation does NOT drop to zero when the temperature is set to zero.
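
      One concrete reason why, as a minimal Python sketch (not taken from any particular inference stack): floating-point addition is not associative, so if the same logits are accumulated in a different order, e.g. because of a different batch size or GPU reduction tree, the rounding difference can flip the argmax even at temperature 0.

      ```python
      import numpy as np

      a = np.float32(1.0)
      b = np.float32(1e8)
      c = np.float32(-1e8)

      # Mathematically (a + b) + c == a + (b + c), but in float32 the small
      # term is absorbed when it is added to the huge term first.
      left_to_right = (a + b) + c   # -> 0.0
      other_order   = a + (b + c)   # -> 1.0
      print(left_to_right, other_order)

      # Treat the result as one of two competing logits at temperature 0,
      # i.e. a pure argmax: the accumulation order alone decides the winner.
      competing = np.float32(0.5)
      print(np.argmax([left_to_right, competing]))  # 1
      print(np.argmax([other_order, competing]))    # 0
      ```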

      • Feyd@programming.dev
        16 hours ago

        Deterministic systems are always predictable, even if you have never run the system. Can you determine the output of an LLM with zero temperature without ever having run it?

        You don’t have to understand a deterministic system for it to be deterministic. You are making that up.

        And even disregarding the above, no, they are still NOT deterministic systems

        I conceded that setting temperature to 0 for an arbitrary system (including all the remote ones most people are using) does not make it deterministic, after reading about the other factors that influence inference in those systems. That does not mean there are no deterministic implementations of LLM inference, and repeating yourself with NO additional information and using CAPS does NOT make you more CORRECT lol.
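
        For a local model, temperature-0 decoding is just a repeated argmax; there is no sampling step left to be random. A minimal sketch of the idea in Python (the toy next_logits function is made up for illustration and stands in for a real forward pass with fixed weights):

        ```python
        import numpy as np

        def next_logits(tokens: list[int], vocab: int = 50) -> np.ndarray:
            # Stand-in for a real forward pass: some fixed function of the
            # tokens so far and fixed weights. Nothing here is random at run time.
            rng = np.random.default_rng(sum(tokens) + len(tokens))
            return rng.standard_normal(vocab)

        def greedy_decode(prompt: list[int], steps: int = 10) -> list[int]:
            tokens = list(prompt)
            for _ in range(steps):
                # Temperature 0 == take the single highest logit, no sampling.
                tokens.append(int(np.argmax(next_logits(tokens))))
            return tokens

        # Same code, same input, same order of operations -> same output, every run.
        print(greedy_decode([1, 2, 3]) == greedy_decode([1, 2, 3]))  # True
        ```

        Run that single-threaded on one machine and it gives the same tokens every time; the nondeterminism in the big hosted systems comes from the serving layer, not from the math of greedy decoding itself.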

        • GreenBeanMachine@lemmy.world
          16 hours ago

          I conceded that…

          So you admit that you were wrong to begin with. And now you’re just grasping at straws to not be completely wrong.

          repeating yourself with NO additional information and using CAPS does NOT make you more CORRECT lol.

          Right back at you buddy.

          • Feyd@programming.dev
            16 hours ago

            In my initial response to you I said I was wrong in that my statement was overly broad and not applicable to the systems most people are using, and then clarified that nondeterminism is not an intrinsic characteristic of the technology at large, but something the most widely used implementations happen to have.

            You apparently think conversations are a battle with winners and losers, so because you were right that the biggest systems are nondeterministic for reasons beyond temperature configuration, it doesn’t matter why, it doesn’t matter that those factors don’t have to apply to every inference system, and it doesn’t matter that you have no idea what determinism means.

            In any case, talking to you seems like a waste of time, so enjoy your sad victory lap. I’m blocking you so that I don’t make the mistake of engaging with you as an earnest interlocutor in the future.