• seang96@spgrn.com · 15 hours ago

    I thought training a model on AI-generated data reduced its effectiveness? Wouldn't this mean they still did something crazy, since they got the opposite result?

    • Drew@sopuli.xyz · 14 hours ago

      I don’t think it’s training on AI-generated data, but rather distillation, which tries to mimic another model.

      So there’s a difference in what’s happening: one takes the data as input and tries to form something new, while the other tries to recreate the input. A rough sketch of the distillation loss is below.
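
      For anyone curious, this is roughly what that looks like in code. A minimal sketch of the classic distillation loss (Hinton et al., 2015), written here in PyTorch; nothing specific to the model in the article is assumed. The point is that the student isn’t trained on raw text: it’s trained to match the teacher’s whole output distribution.

      ```python
      import torch
      import torch.nn.functional as F

      def distillation_loss(student_logits: torch.Tensor,
                            teacher_logits: torch.Tensor,
                            temperature: float = 2.0) -> torch.Tensor:
          # Soften both output distributions with a temperature > 1 so the
          # student also learns the teacher's relative probabilities for
          # the "wrong" answers, not just the top prediction.
          soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
          log_student = F.log_softmax(student_logits / temperature, dim=-1)
          # KL divergence between the two; the T^2 factor keeps gradient
          # magnitudes comparable to an ordinary cross-entropy loss.
          return F.kl_div(log_student, soft_teacher,
                          reduction="batchmean") * temperature ** 2
      ```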

      • seang96@spgrn.com · 13 hours ago

        Ah, I saw it mentioned but the paywall blocked the rest lol

        Distillation is basically reverse engineering for AI, cool.