“The revolution will not be supervised”: Consent and open secrets in data science
Big Data & Society (IF 6.5), Pub Date: 2021-08-17, DOI: 10.1177/20539517211035673
Coleen Carrigan, Madison W Green, Abibat Rahman-Davies

The social impacts of computer technology are often glorified in public discourse, but there is growing concern about its actual effects on society. In this article, we ask: how does “consent” as an analytical framework make visible the social dynamics and power relations in the capture, extraction, and labor of data science knowledge production? We hypothesize that a form of boundary violation in data science workplaces—gender harassment—may correlate with the ways humans’ lived experiences are extracted to produce Big Data. The concept of consent offers a useful way to draw comparisons between gender relations in data science and the means by which machines are trained to learn and reason. Inspired by how Big Tech leaders describe unsupervised machine learning, and by the co-optation of “revolutionary” rhetoric they use to do so, we introduce a concept we call “techniques of invisibility.” Techniques of invisibility are the ways in which an extreme imbalance between exposure and opacity, demarcated along fault lines of power, is fabricated and maintained, closing down the possibility of bidirectional transparency in the production and applications of algorithms. Further, techniques of invisibility, which we group into two categories—epistemic injustice and the Brotherhood—include acts of subjection by powerful actors in data science designed to quell resistance to exploitative relations. These techniques may be useful in making further connections between epistemic violence, sexism, and surveillance; in sussing out persistent boundary violations in data science; and in rendering the social in data science visible and open to scrutiny and debate.



