
Publication

The Effect of Gender De-biased Recommendations — A User Study on Gender-specific Preferences

Thorsten Krause; Lorena Göritz; Robin Gratz
In: CHI '25: Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems. ACM International Conference on Human Factors in Computing Systems (CHI-2025), Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems, April 26 - May 1, Yokohama, Japan, Pages 1-16, No. 1000, ISBN 9798400713941, Association for Computing Machinery, New York, NY, USA, 5/2025.

Abstract

Recommender systems inherently treat users differently. Sometimes, however, personalization turns into discrimination. Gender bias occurs when a system treats users differently based on gender. While most research discusses measures and countermeasures for gender bias, one recent study explored whether users enjoy gender de-biased recommendations. However, its methodology has significant shortcomings: it fails to validate its de-biasing method appropriately and compares biased and unbiased models that differ in key properties. We reproduce the study in a 2x2 between-subjects design with n = 800 participants. Moreover, we examine the authors' hypothesis that educating users on gender bias improves their attitude towards de-biasing. We find that the genders perceive de-biasing differently: female users, the majority group, rate biased recommendations significantly higher, while male users, the minority group, indicate no preference. Educating users on gender bias increased acceptance, but not significantly. We consider our contribution vital to understanding how gender de-biasing affects different user groups.
