• CameronDev

    Dunno that we need much more experimentation; we all know what the outcome will be: loads of sexual harassment and misogyny.

    Also, the statistically correct way to “sound like a girl” is to stay muted and never speak up. Last time I read something on this topic, the gender balance in a lot of games is better than we think, but women overwhelmingly stay muted to avoid harassment.

    The solution isn’t more education/experimentation. Everyone who cares already knows what the problem is. Games need better moderation tools and clear community standards.

    • haui@lemmy.giftedmc.com

      I was very glad to read the last sentence. I agree fully. Easiest would be a report button that saves the last 60 seconds of voice, analyzes it with AI to check whether something illegal/harassing was said, and auto-kicks the person who said it.

      Would not require more top-down systems. Roughly, something like the sketch below.
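
      A minimal sketch of that report flow, assuming a Python-ish game server; every name here is hypothetical and the “AI” step is only a placeholder:

      ```python
      import time
      from collections import deque

      BUFFER_SECONDS = 60  # keep only the last minute of audio per player


      class RollingVoiceBuffer:
          """Holds (timestamp, audio chunk) pairs and drops anything older than the window."""

          def __init__(self, window=BUFFER_SECONDS):
              self.window = window
              self.chunks = deque()

          def push(self, audio_chunk: bytes):
              now = time.monotonic()
              self.chunks.append((now, audio_chunk))
              # anything older than 60 seconds is gone for good
              while self.chunks and now - self.chunks[0][0] > self.window:
                  self.chunks.popleft()

          def snapshot(self) -> bytes:
              return b"".join(chunk for _, chunk in self.chunks)


      def clip_is_harassing(clip: bytes) -> bool:
          """Placeholder for the AI step: a real version would transcribe the clip
          and run the text through a classifier. Always returns False here."""
          return False


      def handle_report(reported_player: str, buffers: dict, kick) -> None:
          """Called when someone presses the report button."""
          buffer = buffers.get(reported_player)
          if buffer is None:
              return
          clip = buffer.snapshot()      # only the last 60 seconds exist at all
          if clip_is_harassing(clip):
              kick(reported_player)     # auto-kick on a positive hit
          # the clip goes out of scope here; nothing is stored long-term
      ```

      The point being: the audio only ever lives in a short in-memory window, and it is only looked at when someone actually reports.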

      • CameronDev

        I personally lean more towards humans for moderation, as words alone don’t convey the full intent and meaning. And this cuts both ways: benign words can be used to harass.

        But of course, humans are expensive, and recordings of voice chat have privacy implications.

        • haui@lemmy.giftedmc.com

          Generally, yes. But computers can handle this kind of thing very well at this point. Kicking someone for using the N-word doesn’t require understanding intent. Just don’t use it, even if it’s for educational purposes (inside a game chat, for example).

          > and recordings of voice chat have privacy implications.

          I don’t think we live in the same reality. Over 30% of people in the US use voice assistants that constantly listen in on their conversations (just the first number I could find; I’m not from the US). Having a bot in a game voice chat store one minute of audio, for one minute, for reporting purposes is like 0.00001% of what is going wrong with security and privacy. Billions of people are getting analyzed, manipulated and whatnot on a daily basis. A reporting tool isn’t even in the same ballpark, let alone the same game, in terms of privacy implications.

          • CameronDev

            Yeah, AI to knock out the egregious stuff (n-bombs etc.) is perfectly reasonable. But there is still a lot of harassment that can happen that really needs a human to interpret. It’s a balance. Something like the split sketched below.
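
            As a rough sketch of that split, assuming a transcript of the reported clip already exists (the word list and review queue are stand-ins, not anything real):

            ```python
            # Unambiguous, no-context-needed terms; the actual list would be curated.
            HARD_BLOCK_TERMS = {"<slur-1>", "<slur-2>"}


            def route_report(transcript: str, auto_kick, human_review_queue: list) -> None:
                """Auto-action the egregious cases, hand everything else to a person."""
                words = {w.strip(".,!?").lower() for w in transcript.split()}
                if words & HARD_BLOCK_TERMS:
                    auto_kick()                            # egregious stuff: no human needed
                else:
                    human_review_queue.append(transcript)  # nuanced cases: a human decides
            ```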

            The privacy I’m thinking of is the legal side of things. Google/FB/Apple are huge companies with the resources to work through the different legal requirements of every state and country, and they can afford to just settle if anything goes wrong. A game studio cannot always do the same. As soon as you store a recording of a user’s voice, even temporarily, it opens up a lot of legal risk. Developers/publishers should still do it imo, but I don’t think it’s something that can just be turned on without careful consideration.

      • GregorGizeh@lemmy.zip

        Yeah, that sounds totally reasonable and unintrusive, wtf. I don’t want my every word spoken in voice chat to be live-analyzed by AI to see if I did a wrongthink.

        Why not simply mute or kick someone if they’re being an asshole? That has served me well in all my years of using Discord or TeamSpeak.

        • haui@lemmy.giftedmc.com

          Apart from what you’re reading into my words, I said that if someone is harassing you or talking about, let’s say, the things they did with their daughter yesterday, you can report them and have a computer look into it instead of a human.

          Whatever privileges you have in your Discord, you can’t kick just anyone in every place. You normally need either privileges or a moderator to do it, and my idea was to use AI to analyze the reported stuff.

          • GregorGizeh@lemmy.zip

            I completely understand the sentiment of protecting children, but at the same time, under that argument you can push the most dystopian, intrusive, overreaching legislation imaginable. It is the old balance of freedom versus safety: we can’t have complete safety without giving up all freedom.

            And I think constant AI-driven monitoring of everything people say in the general vicinity of a microphone is very dystopian, and that is what this would eventually become.

            • haui@lemmy.giftedmc.com

              I’m just gonna repeat myself, since this is the most common answer I get on this topic:

              The vast majority of people are being listened in on, analyzed and manipulated on a daily basis by far, far worse actors. Storing one minute of voice chat for one minute, accessible only to this hypothetical bot, and only if someone reports it (with the reporter facing consequences for wrongful reports), is not comparable to real privacy threats.

              • GregorGizeh@lemmy.zip

                You don’t need to repeat yourself (nor be this condescending). I am well aware that this is happening to some degree already. That doesn’t mean I have to happily concede the little that is left.

                • haui@lemmy.giftedmc.com

                  You’re again reading something into my words that I didn’t say. Maybe try not to play the victim in every comment. It’s abrasive.

                  It’s not happening just to some degree. It’s happening left, right and center. Denying that a computer would help with voice chat moderation does not help at all.

                  Good day.

                  • GregorGizeh@lemmy.zip

                    Right back at ya, buddy. I’m not putting words in your mouth.

                    And no matter how many times you repeat it, my Discord call doesn’t constitute a threat to public safety.

    • onlinepersonaOP

      It’s not about experimentation, but awareness. Experiencing life as a woman IRL is not easy: you can’t get a sex change on a whim or quickly hop into a female body. In an online game, however, changing your voice is probably the most convincing way to do so, and it’s quite easy.
      If even a small percentage of men experiencing the other side of the coin became active in improving the gaming space, it would be something.

      Waiting and hoping for better moderation tools and clear community standards is a non-active course of “action”. It’s like saying “I’m not going to vote because the system is shit 🙅” and expecting it to get better.

      CC BY-NC-SA 4.0

      • CameronDev

        I appreciate what you’re saying, and you’re right that it is a passive course of action (unless one were to campaign/lobby for developers to implement moderation). But my point was that, imo, everyone who cares about the problem is already aware of it, and more awareness doesn’t solve the problem either.

        This has been a problem for decades, and it predates microphones and games. Any platform that allows users to send messages will be used to send abuse. The tried-and-true solution has always been moderation. Riot Games seemed to be making headway with their chat moderation tools, but I haven’t kept up with how that went.

        At a certain point, awareness becomes preaching to the choir. The assholes who are causing the problem won’t change their behavior unless they are forced to.

        • onlinepersonaOP

          > But my point was that, imo, everyone who cares about the problem is already aware of it, and more awareness doesn’t solve the problem either.

          I’m not sure that’s true. Yes, people who care are aware, but I’d argue there are many who don’t care and aren’t aware. I, for example, didn’t know the impact was measurable in performance. I’ve opened my gullet a few times while gaming online and the regret kicked in not long after, but using my mic has been so rare that I wouldn’t have been able to tie the shitty responses to decreased performance.
          It wouldn’t surprise me if the “gamer girls suck because they’re women” crowd joined the challenge and figured out that they were part of the performance problem. That is, if they had the ability to self-reflect, which probably only a minority has.

          > At a certain point, awareness becomes preaching to the choir. The assholes who are causing the problem won’t change their behavior unless they are forced to.

          Oh, in that regard, yes, I agree. At the very base level, assholes will be assholes, and those people can only be forced to change or kicked out.

          • CameronDev

            Yeah, I could be wrong about the level of awareness; I’m a data point of one.

            The performance part is interesting, but almost irrelevant imo. If the results had been that abusing your teammates improves their performance, it would still be wrong to do it.

            I worry the people causing this problem are more likely to take the “abuse == tilt” information and use it to justify their behaviour :(.

            • onlinepersonaOP

              > I worry the people causing this problem are more likely to take the “abuse == tilt” information and use it to justify their behaviour :(.

              A valid concern 😅