• brisk@aussie.zone · ↑16 · 15 hours ago

    I will explain what this means in a moment, but first: Hahahahahahahahahahahahahahahaha hahahhahahahahahahahahahahaha.

    • coyotino [he/him]@beehaw.org (OP) · ↑12 · 17 hours ago

      Right there with you. But seeing them scrambling after being so smug about their overpriced product has been nice.

    • verdare [he/him]@beehaw.org · ↑1 · 12 hours ago

      I find myself in an interesting situation because I want to abolish copyright and institute UBI. I don’t really think you can “steal” images on the internet, but seeing OpenAI whine about intellectual property now does bring some schadenfreude.

  • Pete Hahnloser@beehaw.org · ↑10 · 20 hours ago

    404media usually does great work. The lack of editing here – missing crucial words, subject-verb agreement problems – makes this something I ducked out of halfway through.

  • seang96@spgrn.com · ↑2 · 15 hours ago

    I thought training a model from AI data reduced its effectiveness? Wouldn’t this mean they still did something crazy since they got the opposite results?

    • Drew@sopuli.xyz · ↑6 · 14 hours ago

      I don’t think it’s training from AI data, but rather distillation, which tries to mimic another model.

      So there’s a difference in what’s happening: one takes the data as input and tries to form something new, while the other tries to recreate another model’s output.
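      In code terms, the core of distillation is roughly this (a minimal sketch in plain Python; the function names and temperature value are illustrative assumptions, not anyone’s actual training code):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: higher temperature softens the distribution."""
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distill_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions.

    The student is trained to minimize this, i.e. to mimic the teacher's
    output distribution, rather than learning directly from raw data.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# When the student already matches the teacher, the loss is zero;
# any mismatch makes it positive, pushing the student toward the teacher.
print(distill_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # ~0.0
print(distill_loss([1.0, 2.0, 3.0], [3.0, 2.0, 1.0]))  # > 0
```

      So instead of needing the teacher’s training data, you only need its outputs on some inputs.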

      • seang96@spgrn.com · ↑3 · 13 hours ago

        Ah, I saw it mentioned, but the paywall blocked the rest lol

        Distillation is basically reverse engineering for AI. Cool.