
Beyond stereotypes: Building fair AI for mental health

Lekadir, Karim (UB)

Engineering Sciences

Depression affects millions of people worldwide, and artificial intelligence (AI) is increasingly used to predict who might be at risk, with great potential for enabling early prevention. But what if these AI risk-prediction tools discriminate against some individuals? We uncovered a troubling reality: standard AI models often reflect and reinforce societal biases, particularly in relation to sex, income, and ethnicity.

Analysing data from four different populations, the research team found that standard AI models tend to reproduce existing stereotypes about depression, predicting it more frequently in women, individuals with lower incomes, and white populations. If left unaddressed, such biases could perpetuate existing mental health disparities while denying support to those who need it most.

To address this, our team tested various bias-correction techniques, including a novel post-hoc method that adjusts predictions after model training. While these approaches helped reduce bias, no single solution worked perfectly across all populations, highlighting the need for fairness assessments tailored to each population and subgroup.

This study sends a clear message: AI in mental health has great potential, but it must be carefully designed to avoid deepening existing inequalities. By prioritising fairness from the design phase, we can ensure that AI-driven mental healthcare benefits everyone, regardless of sex, ethnicity, or socio-economic status.
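To make the idea of post-hoc correction concrete: one standard variant adjusts the decision threshold separately for each subgroup after the model is trained, so that no group is flagged disproportionately. The sketch below is a minimal illustration of that family of techniques, not the paper's own method; the function names, the demographic-parity criterion, and the synthetic data are assumptions for illustration.

```python
import numpy as np

def fit_group_thresholds(scores, groups, target_rate=0.2):
    """Choose a separate decision threshold per subgroup so that roughly
    the same fraction (target_rate) of each group is flagged as at risk,
    a demographic-parity style post-hoc correction."""
    thresholds = {}
    for g in np.unique(groups):
        group_scores = scores[groups == g]
        # The (1 - target_rate) quantile leaves ~target_rate of this
        # group's scores above the threshold.
        thresholds[g] = np.quantile(group_scores, 1.0 - target_rate)
    return thresholds

def apply_group_thresholds(scores, groups, thresholds):
    """Binarise raw risk scores using the group-specific thresholds."""
    return np.array([int(s >= thresholds[g]) for s, g in zip(scores, groups)])

# Toy usage: synthetic scores stand in for a trained model's output.
rng = np.random.default_rng(0)
scores = rng.uniform(size=500)
sex = rng.choice(["female", "male"], size=500)

thresholds = fit_group_thresholds(scores, sex, target_rate=0.2)
preds = apply_group_thresholds(scores, sex, thresholds)
```

Because the correction happens after training, it can be applied to any existing risk model without retraining, which is what makes post-hoc methods attractive in clinical settings.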

AI model performance for depression prediction across sex, income, and ethnicity, before and after the application of bias-mitigation techniques, showing improved fairness across subgroups.
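The before/after comparison in the figure implies a quantitative fairness audit run per subgroup. As a hedged sketch (the metric choice and function name are assumptions, not taken from the paper), one common audit computes each group's true-positive rate and the gap between the best- and worst-served groups, once for the baseline model and once after correction:

```python
import numpy as np

def subgroup_fairness_report(y_true, y_pred, groups):
    """Per-subgroup true-positive rate (TPR) and flagged rate, plus the
    largest TPR gap between groups, as a simple fairness audit."""
    report = {}
    for g in np.unique(groups):
        mask = groups == g
        has_depression = y_true[mask] == 1
        # TPR: fraction of truly depressed individuals the model flags.
        tpr = y_pred[mask][has_depression].mean() if has_depression.any() else float("nan")
        report[g] = {"tpr": float(tpr), "flagged_rate": float(y_pred[mask].mean())}
    tprs = [v["tpr"] for v in report.values()]
    report["tpr_gap"] = float(np.nanmax(tprs) - np.nanmin(tprs))
    return report
```

A shrinking "tpr_gap" after mitigation is one simple way to read "improved fairness" in figures like the one above.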


REFERENCE

- Dang V N, Cascarano A, Mulder R H, Cecil C, Zuluaga M A, Hernández-González J & Lekadir K 2024, "Fairness and bias correction in machine learning for depression prediction across four study populations", Scientific Reports, 14(1), 7848.