As a medical doctor, I extensively use digital voice recorders to document my work. My secretary does the transcription. As a cost-saving measure, the process is soon to be replaced by AI-powered transcription, trained on each doctor’s voice. As I understand it, the model created is not stored locally, and I have no control over it whatsoever.

I see many dangers, since the model is trained on biometric data and could possibly be used to recreate my voice. Of course I understand that there are probably already enough recordings of me on the Internet to recreate my voice, but that’s beside the point. Also, this question is about educating them, not a legal one.

How do I present my case? I’m not willing to use a non-local AI to transcribe my voice, but I don’t want to be perceived as a paranoid nutcase. Preferably, I want my bosses and colleagues to understand the privacy concerns and dangers of using a “cloud solution”. Unfortunately, they are totally ignorant of the field of technology, and the explanation/examples need to translate to the layperson.

  • @[email protected] (OP) · 9 months ago

    Unfortunately the interface of the medical records system will be changed when this is implemented. The keyboard input method will be entirely removed.

    • ubergeek77 · 9 months ago

      Even if this gets implemented, I can’t imagine it will last very long with something as completely ridiculous as removing the keyboard. One AI API outage and the entire office completely shuts down. Someone’s head will roll when that inevitably happens.

      • @[email protected] (OP) · 9 months ago

        Ah sorry, I meant removing the option of using the keyboard as an input method in the medical records system. The keyboards themselves aren’t physically removed from the client computers.

        But I agree that in the event of a system failure, the hospital will grind to a halt.

        • ubergeek77 · 9 months ago

          Also, if you get permission from someone in leadership to clone their voice, one angle could be to clone it on ElevenLabs and make it say something particularly problematic, just to stress how easily voice data can be misused.

          If this AI vendor is ever breached, all an attacker has to do is robocall patients pretending to be a real doctor they know. I don’t think I need to spell out how poorly that would go.