Health Affairs, January 19, 2022
Michael Sun, Tomasz Oliwa, Monica E. Peek, and Elizabeth L. Tung

Abstract

Little is known about how racism and bias may be communicated in the medical record. This study used machine learning to analyze electronic health records (EHRs) from an urban academic medical center and to investigate whether providers’ use of negative patient descriptors varied by patient race or ethnicity. We analyzed a sample of 40,113 history and physical notes (January 2019–October 2020) from 18,459 patients for sentences containing a negative descriptor (for example, resistant or noncompliant) of the patient or the patient’s behavior. We used mixed effects logistic regression to determine the odds of finding at least one negative descriptor as a function of the patient’s race or ethnicity, controlling for sociodemographic and health characteristics. Compared with White...
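To make the method concrete, the sketch below (not the authors' published code) illustrates the two steps the abstract describes: flagging history and physical notes that contain a negative descriptor, then modeling the odds of a flagged note as a function of patient race or ethnicity. The descriptor list, column names, and file path are hypothetical, and an ordinary logistic regression stands in for the mixed effects model used in the study.

# A minimal sketch of the analysis described in the abstract; not the authors' code.
# Column names (note_text, patient_id, race_ethnicity, age, sex, insurance),
# the input file, and the descriptor list are hypothetical placeholders.

import re
import pandas as pd
import statsmodels.formula.api as smf

# Example negative descriptors; the study used a larger curated list.
NEGATIVE_DESCRIPTORS = ["resistant", "noncompliant", "agitated", "defensive"]
pattern = re.compile(r"\b(" + "|".join(NEGATIVE_DESCRIPTORS) + r")\b", re.IGNORECASE)

def has_negative_descriptor(note_text: str) -> int:
    """Return 1 if the note contains at least one negative descriptor."""
    return int(bool(pattern.search(note_text)))

# One row per history and physical note, with patient covariates joined on.
notes = pd.read_csv("hp_notes.csv")  # hypothetical file
notes["negative"] = notes["note_text"].apply(has_negative_descriptor)

# Simplification: the published analysis used mixed effects logistic regression
# with a patient-level random intercept (patients contribute multiple notes);
# this sketch fits an ordinary logistic regression instead.
model = smf.logit(
    "negative ~ C(race_ethnicity, Treatment(reference='White'))"
    " + age + C(sex) + C(insurance)",
    data=notes,
).fit()
print(model.summary())

A closer analogue to the published model would add a random intercept for each patient (for example, a mixed effects logistic regression fit in R with glmer, or a Bayesian mixed GLM in statsmodels), since several of the 40,113 notes can come from the same patient.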
