Cultural items like songs play an important role in creating and reinforcing stereotypes, biases, and discrimination. But the actual content of such items is often less transparent. Take songs, for example. Are lyrics biased against women, and how have any such biases changed over time? Natural language processing of a quarter of a million songs spanning 50 years quantifies misogyny: women are less likely than men to be associated with desirable traits (e.g., competence), and while this bias has decreased, it persists. Ancillary analyses further suggest that song lyrics may help drive shifts in societal stereotypes about women, and that the lyrical shifts are driven by male artists (female artists were less biased to begin with). Overall, these results shed light on cultural evolution, on subtle measures of bias and discrimination, and on how natural language processing and machine learning can provide deeper insight into stereotypes and cultural change.
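To make the measurement idea concrete, here is a minimal sketch of one way such an association bias could be quantified from lyrics: count how often gendered words co-occur with trait words within a small context window, then take the male-female difference. The word lists, window size, and scoring are illustrative assumptions, not the paper's actual method (which uses larger-scale NLP over the full corpus).

```python
# Illustrative sketch: windowed co-occurrence association between
# gendered words and trait words in song lyrics. All lexicons below
# are hypothetical placeholders, not the study's word lists.
FEMALE = {"she", "her", "woman", "girl"}
MALE = {"he", "him", "man", "boy"}
COMPETENCE = {"smart", "skilled", "clever", "capable"}

def association(lyrics, gender_words, trait_words, window=3):
    """Fraction of gender-word tokens with a trait word within `window` tokens."""
    tokens = lyrics.lower().split()
    hits = total = 0
    for i, tok in enumerate(tokens):
        if tok in gender_words:
            total += 1
            context = tokens[max(0, i - window): i + window + 1]
            if any(w in trait_words for w in context):
                hits += 1
    return hits / total if total else 0.0

# Toy corpus standing in for a large lyrics collection.
corpus = "she is smart and clever . he is strong . she sings . he is capable"
bias = (association(corpus, MALE, COMPETENCE)
        - association(corpus, FEMALE, COMPETENCE))
```

A positive `bias` means male words co-occur with competence words more often than female words do; tracking this score per release year would give a trend over time. A real analysis would use richer representations (e.g., word embeddings) rather than raw window counts.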