discharge = discharge from hospital

  • Xanjis@lemmy.zip · 4 points · 4 months ago

    This is why automation is good. Humans can’t be trusted to do critical jobs such as doctor, lawyer, cop, teacher, or judge without being influenced by a bad experience they had a decade ago, what they last ate, their pending divorce, or how much sleep they got last night.

    • tacticalsugar · 3 points · edited · 4 months ago

      Honestly that just leads to automation with built-in bias, and now you can’t even threaten a doctor with a malpractice suit because you can’t talk to a person, or the only person you can talk to says “sorry, the computer won’t let me”.

      You can’t use technology to fix social issues. People keep trying, and every time it just hurts chronically ill and disabled people even more. Have you ever heard of NarxCare?

      NarxCare is a prescription drug monitoring program (PDMP) run by Bamboo Health. Bamboo Health was formerly known as Appriss. It is widely used across the United States by pharmacies including Rite Aid as well as those at Walmart and Sam’s Club. The NarxCare software allows doctors to view data about a patient, combining data from the prescription registries of various U.S. states to make the registries interoperable nationally. It also uses machine learning to generate an “Overdose Risk Score” that potentially includes EMS and criminal justice data; these scores have been criticized by researchers and patient advocates for the lack of transparency in the process as well as the potential for disparate treatment of women and minority groups.

      • Xanjis@lemmy.zip · 1 point · 4 months ago

        Sure, you still have innate/learned biases, but eliminating situational bias (a recent divorce) and bodily bias (hunger/sleepiness/sickness) entirely is still a massive reduction in the total amount of bias you face day to day. If anything, being able to see the biases in the data going into something like NarxCare is a good thing, because now you have a paper trail for improvements. You can’t just grab a hundred doctors and ask them “have you ever denied care due to your biases against women?” because the bad ones will either lie or not realize what they have done.

        • tacticalsugar · 2 points · 4 months ago

          I would genuinely rather work with a doctor who just got divorced than have to fight the invisible AI black box that calls me a drug addict for being chronically ill.

          You can’t just grab a hundred doctors and ask them “have you ever denied care due to your biases against women?” because the bad ones will either lie or not realize what they have done.

          Unlike NarxCare, which just denies care due to biases and won’t tell you why, because it’s a machine learning black box. There is no “paper trail” for NarxCare, because denying care to patients is the point. I can at least argue with doctors, or request a new one.

          You can’t fix social issues with technology, and every attempt just makes things worse for the people affected.