• BetaDoggo_@lemmy.world
    1 year ago

    Real material is being used to train some models, but suggesting that this will encourage the creation of more “data” is silly. The amount required to fine-tune a model is tiny compared to the amount already known to exist. Just like how regular models haven’t driven people to create even more data to train on.

    • SuddenlyBlowGreen@lemmy.world
      1 year ago

      Just like how regular models haven’t driven people to create even more data to train on.

      It has driven companies to try to get access to more data people generate to train the models on.

      Like chatGPT on copyrighted books, or google on emails, docs, etc.

      • BetaDoggo_@lemmy.world
        1 year ago

        And what does that have to do with the production of CSAM? In the example given, the data already existed; they’ve just been more aggressive about collecting it.

        • SuddenlyBlowGreen@lemmy.world
          1 year ago

          Well, now in addition to regular pedophiles consuming CSAM, there is an additional group of consumers: people using huge datasets of it to train models.

          If there is an increase in demand, the supply will increase as well.

          • BetaDoggo_@lemmy.world
            1 year ago

            Not necessarily. The same images would be consumed by both groups; there’s no need for new data. This is exactly what artists are afraid of: image generation increases supply dramatically without increasing demand. The amount of data required is also pretty negligible, maybe a few thousand images.