• trent@kbin.social · 1 year ago

    Mostly I just hate the narrative that WE are akin to a language model, where prompts go in and actions come out. It's a gross misrepresentation of the human brain; we don't even know much about how it works, but we do know we don't spit outputs based solely on inputs.

    • wasntme@yiffit.net · 1 year ago

      we don’t spit outputs based solely on inputs.

      I mean, that depends on what you define as an input, but unless you believe in some magic juice that makes us special (like a soul), then yeah, we are just responding based on inputs.

      Now, that doesn't mean AIs work like our brains. Responding based on inputs applies to pretty much everything; that's simply what determinism is: action leads to reaction. It's the process in between taking an input and giving an output that matters.

    • AdmiralRob@lemmy.zip · 1 year ago

      I mean, we kinda do. We just take in so many more inputs than we could ever reasonably feed into a computer, over a much longer time period, and with processing on those inputs that isn't entirely understood.

    • lugal@sopuli.xyz (OP) · 1 year ago

      Yeah, that's just bullshit. It isn't even logically coherent. If we are all just reacting, where is the input supposed to come from?

      • CheeseNoodle@lemmy.world · 1 year ago

        I believe at least one attempt to put a serious definition to sapience defined it as a process in which a creature reacts not only to external input but also to internally generated input. Granted, you can meet that definition by tacking a random number generator onto the inner workings of any LLM.
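
        The "internally generated input" idea above can be sketched as a toy loop. This is purely illustrative (the `toy_agent` function and its mixing rule are invented for this comment, not any real sapience definition or LLM internals): the response depends on the external input, a persistent internal state, and an internally generated (random) signal, so identical external inputs need not produce identical outputs.

        ```python
        import random

        def toy_agent(external_input: str, state: float) -> tuple[str, float]:
            # Internally generated input -- the RNG stand-in mentioned above.
            internal_input = random.random()
            # Internal state evolves, so history also shapes the response.
            state = 0.5 * state + 0.5 * internal_input
            # The output mixes external input and internal state,
            # i.e. it is not a function of the external input alone.
            return f"{external_input}:{state:.2f}", state

        state = 0.0
        reply, state = toy_agent("hello", state)
        ```

        Feed it the same "hello" twice and the replies generally differ, which is the whole point of the definition: behavior isn't determined by external input alone.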