As a medical doctor I extensively use digital voice recorders to document my work. My secretary does the transcription. As a cost-saving measure the process is soon intended to be replaced by AI-powered transcription, trained on each doctor's voice. As I understand it, the model created is not stored locally and I have no control over it whatsoever.

I see many dangers, as the model is trained on biometric data and could possibly be used to recreate my voice. Of course I understand that there are probably other recordings of me on the Internet, enough to recreate my voice, but that's beside the point. Also, this question is about educating my colleagues, not a legal one.

How do I present my case? I'm not willing to use a non-local AI to transcribe my voice. I don't want to be perceived as a paranoid nutcase. Preferably, I want my bosses and colleagues to understand the privacy concerns and dangers of using a "cloud solution". Unfortunately they are totally ignorant of the field of technology, and the explanations/examples need to translate to the layperson.

  • FlappyBubble@lemmy.ml (OP), 10 months ago

    The problem with incorrect transcription exists with my secretary too. In the system I work in, the secretary transcribes my recording and sends it to me, and I read it. I can edit the text at this point and then digitally sign it with a personal private key. This usually happens at least a day after the recording is made. All prescriptions or orders to my nurses are given in another system, separate from the raw text in the medical records. I can't easily explain the practical workings, but I really don't see that the AI system will introduce more errors.

    But I agree that in the event of a system failure, the situation would be catastrophic.