Text Mining for Bias: A Recommendation Letter Experiment
American Business Law Journal (IF 1.743), Pub Date: 2022-04-06, DOI: 10.1111/ablj.12198
Charlotte S. Alexander

This article uses computational text analysis to study the form and content of more than 3000 recommendation letters submitted on behalf of applicants to a major U.S. anesthesiology residency program. The article finds small differences in form and larger differences in content. Women applicants' letters were more likely to contain references to acts of service, for example, whereas men were more likely to be described in terms of their professionalism and technical skills. Some differences persisted when controlling for standardized aptitude test scores, on which women and men scored equally on average, and other applicant and letter-writer characteristics. Even when all explicit gender-identifying language was stripped from the letters, a machine learning algorithm was able to predict applicant gender at a rate better than chance. Gender stereotyped language in recommendation letters may infect the entirety of an employer's hiring or selection process, implicating Title VII of the Civil Rights Act of 1964. Not all gendered language differences were large, however, suggesting that small changes may remedy the problem. The article closes by proposing a computationally driven system that may help employers identify and eradicate bias, while also prompting a rethinking of our gendered, racialized, ableist, ageist, and otherwise stereotyped occupational archetypes.
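The abstract's central computational claim is that a classifier trained on letters stripped of explicit gender markers still predicts applicant gender better than chance. A minimal sketch of that kind of probe is below; the article does not disclose its exact model or features, so the scikit-learn pipeline, the choice of TF-IDF with logistic regression, and the toy letter excerpts are all illustrative assumptions, not the author's method.

```python
# Sketch: train a text classifier on (de-gendered) letter excerpts and
# check whether it can predict applicant gender labels. Model, features,
# and the toy data are illustrative assumptions, not the study's pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical stand-ins for letter excerpts with gendered words removed;
# service-themed vs. skill-themed phrasing mirrors the differences the
# article reports.
letters = [
    "organized the volunteer clinic and served on the wellness committee",
    "helped coordinate community outreach and supported junior colleagues",
    "demonstrated outstanding technical skill and professionalism",
    "mastered complex airway procedures with exceptional technical ability",
    "led service projects and mentored students with great care",
    "showed superb clinical judgment and procedural expertise",
]
labels = ["W", "W", "M", "M", "W", "M"]  # applicant gender labels

# TF-IDF features feed a logistic regression; with real data one would
# hold out a test set and compare accuracy against the 50% chance rate.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(letters, labels)

preds = clf.predict([
    "organized service activities for the community clinic",
    "displayed remarkable technical proficiency",
])
print(preds)
```

If accuracy on held-out letters exceeds chance, gendered information survives in word choice alone, which is the mechanism the article identifies as a potential Title VII concern.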
