You willingly give them key points of information about yourself, directly or indirectly. They then read those signals and use your own information against you to convince you they have answers. And they are often wrong, but you walk around repeating their “insights” as if they are true.

  • Blue_Morpho@lemmy.world
    1 month ago

    I’m not sure what you are trying to get out of AI? Therapy?

    I used it to help write an Excel script for my wife because I didn’t want to learn VBScript. It worked amazingly. I’ve talked to other programmers who have done the same. Another friend won a multi-million-dollar contract with a bid where chatgpt filled in all the boilerplate, saving him hours of work.

    AI is a time saver like a pocket calculator. It’s not going to think for you. But the productivity gains are real.

    • Ton the Supermassive@lemm.ee
      1 month ago

      For the past few months, Google has been absolutely dogshit with search results: almost everything I search for is hidden behind an ad or just convoluted somehow. So I started using chatgpt for almost everything. It’s so much better than Google.

    • Snapz@lemmy.worldOP
      1 month ago

      Cool. You’re different, you’re not like the other girls…

      You seem to have main character syndrome though. You think your edge case (or the claimed example of your friend as a professional foolishly trusting their future to the hallucinating robot) is the thing that drives this AI moment, but no. The VAST majority of marketed use cases and associated revenue gained from the BROAD population is for much more trivial applications.

      You also need a little more ability to process an abstract thought here. The “customer” of the palm reader isn’t me or any individual; it’s humanity collectively. We tell it what “The Catcher in the Rye” is, and it steals and internalizes that human work. Then, very often, it incorrectly reinterprets human thought back to us, while mentioning enough of the minor details we provided earlier to better convince us of its confident lies.

      • Blue_Morpho@lemmy.world
        1 month ago

        Starting off with an insult, nice.

        “claimed” ??? As if I’m making up a story because I’m being paid by Sam Altman?

        “Trusting their future to hallucinating”

        My friend, a sales support engineer, of course didn’t just copy whatever chatgpt wrote. He proofread it and fixed it. It still saved hours compared to starting from nothing and then still needing to proofread.

        I SAID IT WON’T THINK FOR YOU.