“If you’ve ever hosted a potluck and none of the guests were spouting antisemitic and/or authoritarian talking points, congratulations! You’ve achieved what some of the most valuable companies in the world claim is impossible.”

  • @Touching_Grass@lemmy.world
    8
    8 months ago

    I think it’s a numbers game. If the Fediverse had the numbers, it would be plagued with all the same issues. But it’s a little fish in a big pond.

    • JustinHanaganOP
      5
      8 months ago

      If a Fediverse instance grew so big that it couldn’t moderate itself and had a lot of spam/Nazis, presumably other instances would just defederate, yeah? Unless an instance is ad-supported, what’s the incentive to grow beyond one’s ability to stay under control?

        • @fubo@lemmy.world
          0
          8 months ago

          questionable pictures

          We need to keep distinguishing “actual, real-life child-abuse material” from “weird/icky porn”. Fediverse services have been used to distribute both, but they represent really different classes of problem.

          Real-life CSAM is illegal to possess. If someone posts it on an instance you own, you have a legal problem. It is an actual real-life threat to your freedom and the freedom of your other users.

          Weird/icky porn is not typically illegal, but it’s something many people don’t want to support or be associated with. Instance owners have a right to say “I don’t want my instance used to host weird/icky porn.” Other instance owners can say “I quite like the porn that you find weird/icky, please post it over here!”

          Real-life CSAM is not just extremely weird/icky porn. It is a whole different level of problem, because it is a live threat to anyone who gets it on their computer.

          • @ubermeisters@lemmy.world
            -3
            8 months ago

            No, let’s just say both are fucking creepy and not allow either thanks. Your desire to draw a line between them is sus also.

            • @fubo@lemmy.world
              4
              8 months ago

              You’d be surprised by how much of the Internet was built by furries, BDSM folk, and other people whose porn a lot of folks think is weird and icky.

              Also, you seem to have misunderstood the gist of my comment, or I wasn’t clear enough. The tools to deal with CSAM will of necessity be a lot stronger than content moderation that’s driven by users’ preferences of what they’d like not to see.

              • @ubermeisters@lemmy.world
                -3
                8 months ago

                The issue is your categorization, and either the thought, or lack of thought, that went into making it: “real CSAM” and “the icky stuff”.

                When you categorized the first as “real”, it leaves a gap for the rest of “fake” and “implied” CSAM, which I, the reader, am left assuming goes in your other category, especially since your other category has no specifics, and we all know what CSAM is. That was the logic behind my comment:

                “If somebody is tiptoeing around abusive material it’s because they want to view abusive material.”

                Also, I find it suspect that you’ve characterized the issue with CSAM as being that you can get in trouble for owning it, not that making it wrecks somebody’s fucking life…

                Honestly I think you would be better off deleting your comment completely. White knighting the term “questionable pictures” in a public forum isn’t a good look regardless of what you meant.

                • @fubo@lemmy.world
                  2
                  8 months ago

                  I’m talking about the necessities of moderation policy.

                  The things you find it “suspect” that I’m not saying? Those are things I think are obviously true and don’t need to be restated. Yes, child abuse is very bad. We know that. I don’t need to say it over again, because everyone already knows it. I’m talking specifically about the needs of moderation here.

                  I’m pointing at the necessary distinction between “you personally morally object to that material” and “that material will cause the law to come down on you and your users and anyone who peers with you”.

                  You should have the ability to keep both of those off your server, but the latter is way more critical.


                  “White knighting”? Delete your account.