Autocomplete is a popular search feature that predicts queries based on user input and guides users to a set of potentially relevant suggestions. In this study, we examine how YouTube autocompletes serve as an information source for users exploring information about race. We perform an algorithm output audit of autocomplete suggestions for input queries about four racial groups and examine the stereotypes they embody. Using critical discourse analysis, we identify five major sociocultural contexts in which racial biases manifest -- Appearance, Ability, Culture, Social Equity, and Manner. Our results show evidence of aggregated discrimination and interracial tensions in the autocompletes we collected and highlight their potential risks of othering racial minorities. We call for urgent innovations in content moderation policy design and enforcement to address these biases in search outputs.