• Chozo@fedia.io · 5 months ago

    Microsoft’s much-heralded Notepad.exe was storing files as plain text

    Same level of security concern. Quit putting your sensitive data into apps that aren’t meant for it.

    • NutWrench@lemmy.world · 5 months ago

      Yup. Especially apps that are pushing the wonders of cloud services to share that data everywhere.

  • JackbyDev@programming.dev · 5 months ago

    Microsoft’s much-heralded Word app was storing documents as unencrypted DOCX files, leaving them viewable by any malware.

    • drawerair@lemmy.world · 5 months ago

      We mustn’t enter any private info into a large language model (LLM) in the first place. The conversations are probably used to train AI models.

      There should be two disclaimers on any LLM:

      1. The LLM’s responses aren’t always based on facts; it can sometimes give wrong information.

      2. Users mustn’t enter any private info into the LLM.

  • normalexit@lemmy.world · 5 months ago

    So many apps use SQLite or JSON files for storage without encryption; this doesn’t seem like much of a discovery (see the sketch below).

    In any case, don’t share PII or any of your deepest, darkest secrets with it.
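
    A minimal sketch of why that matters. The directory, file names, and table name here are hypothetical, not any specific app’s layout: the point is that an unencrypted JSON or SQLite store can be read by any process running as the same user, with nothing but standard-library calls and no permission prompt.

    ```python
    # Hypothetical example: reading another app's unencrypted local storage.
    # The directory, file names, and table name are made up for illustration.
    import json
    import sqlite3
    from pathlib import Path

    app_dir = Path.home() / "Library" / "Application Support" / "SomeChatApp"

    # Unencrypted JSON: one read and the full contents are available.
    settings = json.loads((app_dir / "settings.json").read_text())

    # Unencrypted SQLite: same story. No key, no prompt, just a query.
    with sqlite3.connect(app_dir / "storage.db") as db:
        messages = db.execute("SELECT * FROM messages").fetchall()

    print(settings)
    print(messages[:5])
    ```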

  • rottingleaf@lemmy.zip · 5 months ago

    I store almost everyfuck in plain text, so what?

    Oh, somebody wants to use techbro stuff and expect security.

    • soulfirethewolf@lemdro.id · 5 months ago

      Many people now use ChatGPT like they might use Google: to ask important questions, sort through issues, and so on. Often, sensitive personal data could be shared in those conversations.

      • Ragdoll X@lemmy.world · 5 months ago

        Don’t a lot of people also keep their tax information as plain text on their PC? If someone’s really worried about that stuff being leaked, I think it’s on them to download VeraCrypt or something, and also not to use ChatGPT for sensitive stuff, knowing that OpenAI and Apple will obviously use it as training data.

      • rottingleaf@lemmy.zip · 5 months ago

        Well, there’s a good side to this: at least the recipe for that totally not poisonous green cocktail will be available from the logs.

  • AutoTL;DR@lemmings.world (bot) · 5 months ago

    This is the best summary I could come up with:

    OpenAI announced its Mac desktop app for ChatGPT with a lot of fanfare a few weeks ago, but it turns out it had a rather serious security issue: user chats were stored in plain text, where any bad actor could find them if they gained access to your machine.

    As Threads user Pedro José Pereira Vieito noted earlier this week, “the OpenAI ChatGPT app on macOS is not sandboxed and stores all the conversations in plain-text in a non-protected location,” meaning “any other running app / process / malware can read all your ChatGPT conversations without any permission prompt.”

    OpenAI chose to opt out of the sandbox and store the conversations in plain text in a non-protected location, disabling the built-in defenses macOS provides.

    OpenAI has now updated the app, and the local chats are now encrypted, though they are still not sandboxed.

    It’s not a great look for OpenAI, which recently entered into a partnership with Apple to offer chat bot services built into Siri queries in Apple operating systems.

    Apple detailed some of the security around those queries at WWDC last month, and those protections are more stringent than what OpenAI did (or, to be more precise, didn’t do) with its Mac app, which is a separate initiative from the partnership.

    The original article contains 291 words, the summary contains 211 words. Saved 27%. I’m a bot and I’m open source!
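
    For anyone curious what “encrypted at rest” means in practice, here is a generic illustration of the approach described in the summary above. It is not OpenAI’s actual implementation; the file name and key handling are assumptions, and a real app would keep the key in the OS keychain rather than generate it ad hoc.

    ```python
    # Generic sketch of encrypting a chat log at rest.
    # Not OpenAI's implementation; file name and key handling are made up.
    from pathlib import Path

    from cryptography.fernet import Fernet  # third-party: pip install cryptography

    chat_file = Path("conversation.enc")

    # In a real app the key would live in the OS keychain, not be generated here.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    def save_chat(plaintext: str) -> None:
        """Write the chat as ciphertext; other processes only see opaque bytes."""
        chat_file.write_bytes(cipher.encrypt(plaintext.encode("utf-8")))

    def load_chat() -> str:
        """Decrypt the chat; without the key, the file on disk is unreadable."""
        return cipher.decrypt(chat_file.read_bytes()).decode("utf-8")

    save_chat('{"role": "user", "content": "hello"}')
    print(load_chat())
    ```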

  • OsrsNeedsF2P@lemmy.ml · 5 months ago

    Apple has been running ad campaigns about how “Safari is a private browser” lately. The irony of screwing this up, when they even sandbox your Downloads folder.

    I’m an idiot.

        • Thekingoflorda@lemmy.world · 5 months ago

          Why would Apple care about the privacy implications of OpenAI? No one will blame Apple for privacy concerns arising because of them.

          • aeronmelon@lemmy.world · 5 months ago

            Now that OpenAI’s technology is integrated all the way across Apple’s flagship software and flagship devices, I guarantee you people will blame Apple if OpenAI fumbles privacy, even if it’s just on their end.

            I’ve been hearing mixed reactions to Apple choosing OpenAI, because of recent drama and because of Sam Altman specifically. To me, it feels like a “keep your enemies closer” decision on Apple’s part because while the company sucks, they do have a competitive (potentially superior) service at the moment.

            And Apple has jack without some kind of partnership.

            • willya@lemmyf.uk · 5 months ago

              Well, the ChatGPT Mac app and the universal Siri AI are two different things.

              • aeronmelon@lemmy.world · 5 months ago

                Imagine if it was just the OpenAI app.

                The masses want AI, even if they don’t know why. And OpenAI is a big name, even if they make Google look privacy-conscious. The smartest thing for Apple to do is funnel as many of the inevitable OpenAI users on its platforms as possible through its own sanitized version of the service.

                • willya@lemmyf.uk · 5 months ago

                  I don’t like to blindly imagine things like most of the Lemmy user base.