No, this is not a Black Mirror episode.

  • Ragnell@kbin.social · 1 year ago

    I find the “they didn’t have permission to train from” argument is complete bunk. That’s not a right granted by intellectual property laws; there is no “right to control who learns from a work”.

    Yeah, but is an AI LEGALLY learning? Or is it just a machine that spits out output based on its inputs? In that case, using the work as input isn’t a use the copyright permits; the permission granted is for the work to be read.

    All these comparisons between what an AI is doing and what a human does when reading/learning/etc are not a given in a court of law. We don’t have any rulings yet that an AI is actually “learning” like a human when it is “trained.”

    “Training” an AI is building a tool. A tool that can be used to profit. Can artistic works be used to build a for-profit tool without permission?

    This is something that needs to be decided, and whatever rules get set for AI won’t carry over to humans. Even if permission becomes a requirement for use in machine learning, a human will still be free to learn from the work. So the comparison is pointless, because there is no way the courts are going to rule that these things are legally indistinguishable from people.

    In the meantime, back to the original point: there ARE precedents for controlling the use of performances because of recordings. That’s why the studios wanted that clause in the contract; they KNOW they cannot manipulate a person’s performance through AI without their express written permission. Is it REALLY so hard to believe this can be applied to writing or art? That they can’t use writing or art without the artist’s express permission?

    We may see a new kind of copyright soon that specifically disallows use for AI, and another that is open for use with AI. Something to replace Creative Commons on the internet.

    • effingjoe@kbin.social · 1 year ago

      There simply isn’t a right to control even training. That’s just not a thing. It would need a change to the law.

      • Ragnell@kbin.social · 1 year ago

        All right, I am not a lawyer, but I’ve been around the internet long enough to know there is arguably a right to control learning and training. The fair use provision of copyright law SPECIFICALLY allows for educational use; if educational use needed a carve-out, the default must be that such use would otherwise not be allowed.

        A judge could easily rule that AI training is not covered under Fair Use, as it is being used to create a profitable tool.

        • effingjoe@kbin.social · 1 year ago

          That’s a right to make copies and distribute them for educational purposes. This is specifically not involving distribution of any kind. Arguably copyright law doesn’t even apply, but even under the broader term of “intellectual property”, it doesn’t hold up, even without trying to make a comparison between humans learning and AI training. (which is more of an analogy)

          Edit: and to be fair, I’m not a lawyer either, but IP law (especially regarding how terrible it has become) is kind of a hobby of mine. But I can’t claim to be any type of authority on it.

          • Ragnell@kbin.social · 1 year ago

            Okay, well my hobby is ethics.

            And the thing is, if they are using works written by others to build an AI for profit, without permission, that’s exploitation. Copyright law is horrible and gets exploited by corporations constantly, but that doesn’t mean we shouldn’t cheer on the little guy when he tries to use it to defend against exploitation by those corporations. The big tech companies are exploiting creatives in their drive to build and sell this tool; they are exploiting creatives to make their own replacements. So I’m going to push back on any comparison analogy.

            Whatever the precise basis of the lawsuits against the AI companies, actual lawyers clearly think IP law gives grounds to sue, because a few high-profile suits have been filed. And there is clearly some legal basis for objecting when they use AI to create from performances, or this contract would never have been proposed.

            • effingjoe@kbin.social · 1 year ago

              If you’re leaning on morality, then the comparison to humans becomes relevant again.

              Lawyers taking a high profile case is not any indication to go by.

              I could be off base here, but are you financially impacted if AI starts making commercial art? Like, is that how you make income, too?

              • Ragnell@kbin.social · 1 year ago

                I have skills besides technical writing, but it’s one of the things I rely on to get hired. So yeah, I’m partially on the chopping block, even ahead of creative writers. And it’s a serious problem that all the writing I’ve done on the internet is being used to train AI.

                But the thing about the moral comparison to humans is that eventually a line gets crossed. And once that line is crossed, you can’t OWN an AI anymore, and you certainly can’t sell one. Up until then, you have to treat it as a tool.

                The end solution is going to be something along the lines of a Creative Commons license where you specify whether your work can be used to train AI, whether it can’t, or whether it can only be used to train non-profit AI.

                • effingjoe@kbin.social · 1 year ago

                  I don’t follow why calling it a tool matters. If a Python script renders someone’s job redundant (hypothetically; this is unlikely in reality), does it matter whether the script was written by a human or an LLM?

                  • Ragnell@kbin.social · 1 year ago

                    @effingjoe I imagine it matters to the person who wrote it. Were THEY paid for this?

                    I mean, it’s a shitty thing when consultants and the like eliminate jobs, but at least the exploitation there falls on only one side: the poor guy who gets kicked out. If an LLM is removing someone’s job, then the people whose work was used to train the LLM are getting exploited too.

                    Plus, a certain amount of the law is for deterrence. We don’t want the companies replacing creatives with AI. It would be beneficial to discourage that. We DO want things like fruit-picking and weeding and other backbreaking manual labor replaced by AI, so we can push for laws that encourage THAT. But right now they are trying to replace the wrong end.