When not astutely overseen by human intelligence, deployed AI can produce discriminatory outcomes. That’s because biases that go unnoticed in the input data can become amplified in the outputs. Having your preferences catered to can sometimes be great. But it can also be debilitating in insidious ways, as in the search for the “right” therapist.
Racial and ethnic minorities tend to prefer providers who share their identity. A 2015 study demonstrated that individuals would knowingly accept less effective therapy in exchange for sharing identity features with their therapist. The overlooked danger in using artificial intelligence or other tools to prioritize identity features when selecting a mental health provider is that doing so could systematically reinforce confirmation bias in some individuals seeking therapy.
The idea of establishing a relationship with a therapist outside a patient’s natural comfort zone may seem insensitive to patient autonomy. But a successful patient/provider relationship depends more on a multi-dimensional interpersonal connection than on one or a few demographic features. And any homogenization of care could significantly worsen access to care for many minority patients.
https://www.statnews.com/2019/09/20/artificial-intelligence-tool-finding-mental-health-therapist/