• Coldcell@sh.itjust.works
    2 days ago

    Writing code to process data is absolutely not the same way anesthesiology works 😂 Comparing state-specific, logic-bound systems to the messy biological processes of a nervous system is what gets this misattribution of ‘AI’ in the first place. Currently it is just glorified auto-correct working off statistical data about human language. I’m still not sure how a written program can have a voodoo spooky black box that does things we don’t understand as a core part of it.
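
    (Rough sketch of what I mean by “statistical auto-correct”: a toy bigram word-counter I made up for illustration. Real LLMs are neural networks over tokens, not count tables, but the “predict the next word from statistics about human language” idea is the same.)

```python
# Toy "auto-correct": pick the next word purely from frequency statistics
# seen in the training text. Purely illustrative, not how any real LLM
# is implemented.
from collections import Counter, defaultdict

corpus = "the closest planet to the sun is mercury . the sun is a star .".split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def autocomplete(word):
    """Return the statistically most likely next word, if any."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(autocomplete("the"))  # -> 'sun' (follows 'the' most often in this corpus)
print(autocomplete("sun"))  # -> 'is'
```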

    • irmoz@lemmy.world
      2 days ago

      The uncertainty comes from reverse-engineering how a specific output relates to the prompt input. It uses extremely fuzzy logic to compute the answer to “What is the closest planet to the Sun?” We can’t meaningfully interpret which nodes in the neural network fired or what each one contributed, so we can’t precisely say how the answer was computed.
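
      (Toy illustration of what I mean, using a made-up two-layer net with random weights: you can dump every intermediate activation, but the numbers don’t come with meanings attached, and real models have billions of them.)

```python
# You can record exactly which "nodes" fired -- the problem is that the
# activations are just unlabeled numbers. Toy 2-layer net, random weights.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 4))  # input -> hidden
W2 = rng.normal(size=(3, 8))  # hidden -> output

def forward(x):
    hidden = np.maximum(0, W1 @ x)  # ReLU nodes: these did "fire"...
    output = W2 @ hidden
    return hidden, output

hidden, output = forward(np.array([1.0, 0.0, 0.5, -0.2]))
print("hidden activations:", np.round(hidden, 2))  # ...but what do they *mean*?
print("output:", np.round(output, 2))
```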