• mindlesscrollyparrot@discuss.tchncs.de · 1 day ago

    AIs take away attribution as well as copyright. The original authors don’t get any credit for their creativity and hard work. That is an entirely separate thing from ownership and property.

    It is not at all OK for an AI to take a work, even one that is in the public domain, erase the author’s identity, and then reproduce it for people, claiming it as its own.

      • mindlesscrollyparrot@discuss.tchncs.de · 3 hours ago

        Is one of those things giving attribution? If I ask for a picture of Mount Fuji in the style of a woodblock print, can the AI tell me what its inspirations were?

        • lime!@feddit.nu · 3 hours ago

          it can tell you its inspiration about as well as Photoshop’s content-aware fill can, because it’s broadly the same kind of tech, just turned up to 11. but it depends.

          if a lot of the training data is tagged with the artist’s name, and you use that name in the prompt to get the style, and the output looks like that artist’s work, you can be fairly sure who to attribute. if not, you’d have to do a mathematical analysis of the model. that’s because the model isn’t really associating text with images directly: the text part is separate from the image part, and they only communicate through a sort of shared coordinate system (an embedding space). one part sees text, the other sees shapes. a rough sketch of that separation is below.
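
          here’s a rough illustration of that separation, using the open-source CLIP model through the Hugging Face transformers library (image generators use related but not identical pieces, and the image file name here is just a placeholder):

              from PIL import Image
              import torch
              from transformers import CLIPModel, CLIPProcessor

              model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
              processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

              # the text encoder never sees pixels and the image encoder never sees words;
              # each just produces a vector of coordinates in the same shared space
              text_inputs = processor(text=["Mount Fuji in the style of a woodblock print"],
                                      return_tensors="pt", padding=True)
              image_inputs = processor(images=Image.open("fuji_woodblock.jpg"),  # placeholder file
                                       return_tensors="pt")

              with torch.no_grad():
                  text_vec = model.get_text_features(**text_inputs)
                  image_vec = model.get_image_features(**image_inputs)

              # "how well do the coordinates line up" is the only link between the two halves
              print(torch.nn.functional.cosine_similarity(text_vec, image_vec).item())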

          also, comparing the size of the training dataset to the size of the finished model, there’s only room for roughly a byte of weights per full training image, nowhere near enough to store the images themselves (rough numbers below). the fact that some models can reproduce training images almost exactly is basically a fluke, because none of the original file is stored in there. the model just pulls together everything it has learned and ends up rebuilding something that already exists.
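
          the back-of-the-envelope version of that, with purely illustrative numbers (real dataset and checkpoint sizes vary a lot between models):

              # back-of-the-envelope version of the size argument;
              # both numbers are illustrative assumptions, not specs of any particular model
              num_training_images = 2_000_000_000   # order of magnitude of big image datasets
              checkpoint_bytes = 2 * 1024**3        # a ~2 GB file of model weights

              bits_per_image = checkpoint_bytes * 8 / num_training_images
              print(f"~{bits_per_image:.1f} bits of weights per training image")  # ~8.6, about a byte

              # the image itself is vastly bigger: a 512x512 RGB image is ~6.3 million bits raw
              image_bits = 512 * 512 * 3 * 8
              print(f"image is ~{image_bits / bits_per_image:,.0f}x larger than its share of weights")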