• haui@lemmy.giftedmc.com
    10 months ago

    Generally, yes. But computers can handle a lot of this well at this point. Kicking someone for using the N-word doesn't require interpreting meaning. Just don't use it, even for educational purposes (inside a game chat, for example).

    and recordings of voice chat have privacy implications.

    I don't think we live in the same reality. Over 30% of people in the US use voice assistants that constantly listen in on their conversations (that was just the first number I could find; I'm not from the US). Having a bot in a game voice chat store 1 minute of text for 1 minute for reporting purposes is a tiny fraction of what is going wrong with privacy today. Billions of people are getting analyzed, manipulated and whatnot on a daily basis. A reporting tool is not even the same game, let alone in the same ballpark, in terms of privacy implications.
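    The "store 1 minute of text for 1 minute" idea described above can be sketched as a rolling buffer that evicts anything older than the window, so a report captures recent context without long-term storage. This is purely an illustrative sketch; the class name, API, and the 60-second window are assumptions, not anything from an actual game's implementation.

    ```python
    import time
    from collections import deque


    class ReportBuffer:
        """Keeps only the last `window` seconds of chat lines, so filing a
        report can attach recent context while older data is discarded.
        Hypothetical sketch; not any real moderation system's API."""

        def __init__(self, window: float = 60.0):
            self.window = window
            self.lines: deque[tuple[float, str]] = deque()  # (timestamp, text)

        def add(self, text: str, now: float | None = None) -> None:
            now = time.monotonic() if now is None else now
            self.lines.append((now, text))
            self._evict(now)

        def snapshot(self, now: float | None = None) -> list[str]:
            """Called when a user files a report: returns what is still buffered."""
            now = time.monotonic() if now is None else now
            self._evict(now)
            return [text for _, text in self.lines]

        def _evict(self, now: float) -> None:
            # Drop everything older than the retention window.
            while self.lines and now - self.lines[0][0] > self.window:
                self.lines.popleft()
    ```

    The point of the design is that nothing outlives the window unless a report snapshot is taken, which keeps the retained data minimal.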

    • CameronDev
      10 months ago

      Yeah, using AI to knock out the egregious stuff (N-bombs etc.) is perfectly reasonable. But there is still a lot of harassment that really needs a human to interpret. It's a balance.
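      The triage described above (automated action on unambiguous terms, human review for everything else) might look something like this minimal sketch. The blocklist entries and function name are placeholders I made up; a real system would need far more nuance than exact word matching.

      ```python
      # Placeholder blocklist; a real deployment would use a curated,
      # locale-aware list and more robust matching than exact words.
      BLOCKLIST = {"slur1", "slur2"}


      def triage(message: str) -> str:
          """Return 'auto_action' for unambiguous blocklist hits,
          'human_review' for everything a person must interpret."""
          words = set(message.lower().split())
          if words & BLOCKLIST:
              return "auto_action"
          return "human_review"
      ```

      Exact-match hits are cheap to action automatically; the ambiguous remainder, which is where most real harassment lives, still lands in a human queue.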

      The privacy issue I am thinking of is the legal side of things. Google/FB/Apple are huge companies with the resources to work through the different legal requirements of every state and country, and they can afford to just settle if anything goes wrong. A game studio cannot always do the same. As soon as you store a recording of a user's voice, even temporarily, it opens up a lot of legal risk. Developers/publishers should still do it, imo, but I don't think it's something that can just be turned on without careful consideration.