Can Natural Language Processing Reveal Doctors’ Attitudes toward Specific Medical Conditions?

H. Brooke Scoles

Scoles-Masters-Thesis.pdf

Abstract

In recent decades, a growing body of economics research has examined biased beliefs and their impact on labor markets, education outcomes, housing, and other economic activities. The discrimination that results from biased beliefs is difficult to observe directly, but a combination of econometric techniques and inventive experimental design has provided convincing evidence that such bias exists. Research into biased beliefs and discrimination has mostly focused on gender and race; far less has been done on disability. Within existing studies of disability there is almost nothing on "hidden" disabilities or disease, and no papers examine the influence of bias on patients. Systemic biases in the judgment of patients, if they exist, can influence their care. However, such biases are difficult to measure in a lab setting, because that environment obscures behavior that is not easily detectable or that is intentionally hidden. One way to approach the question is to look for evidence in the language used in anonymous settings. Natural language processing approaches, including LASSO-logistic regression and emotion dictionaries, provide the tools for this mode of analysis. The analysis focuses on a specific disease, ME/CFS, because its characteristics make it a good candidate for researching attitudes and biased beliefs. The results suggest that medical decisions regarding treatment are not entirely objective and are influenced by incorrect beliefs. The language used by medical professionals shows that doctors' attitudes toward patients are not consistent but vary across diseases. Such differences have economic implications, potentially lowering the quality of care, worsening health outcomes, and reducing labor market productivity or participation.

© Morten, 2020