Volume 2, Issue 3


Data Science as Political Action: Grounding Data Science in a Politics of Justice

Ben Green
Society of Fellows and the Gerald R. Ford School of Public Policy, University of Michigan, Ann Arbor, MI 48109, USA

Abstract

In response to public scrutiny of data-driven algorithms, the field of data science has adopted ethics training and principles. Although ethics can help data scientists reflect on certain normative aspects of their work, such efforts are ill-equipped to generate a data science that avoids social harms and promotes social justice. In this article, I argue that data science must embrace a political orientation. Data scientists must recognize themselves as political actors engaged in normative constructions of society and evaluate their work according to its downstream impacts on people’s lives. I first articulate why data scientists must recognize themselves as political actors. In this section, I respond to three arguments that data scientists commonly invoke when challenged to take political positions regarding their work. In confronting these arguments, I describe why attempting to remain apolitical is itself a political stance—a fundamentally conservative one—and why data science’s attempts to promote “social good” dangerously rely on unarticulated and incrementalist political assumptions. I then propose a framework for how data science can evolve toward a deliberative and rigorous politics of social justice. I conceptualize the process of developing a politically engaged data science as a sequence of four stages. Pursuing these new approaches will empower data scientists with new methods for thoughtfully and rigorously contributing to social justice.

Keywords: data science, ethics, politics, social justice, social change, social good, pedagogy


Publication history

Received: 20 May 2021
Revised: 19 November 2021
Accepted: 25 November 2021
Published: 13 January 2022
Issue date: September 2021

Copyright

© The author(s) 2021

Acknowledgements

B. Green is grateful to the Berkman Klein Center Ethical Tech Working Group for fostering his thinking on matters of technology, ethics, and politics. B. Green also thanks Catherine D’Ignazio, Anna Lauren Hoffmann, Lily Hu, Momin Malik, Dan McQuillan, Luke Stark, Salomé Viljoen, and the reviewers for providing helpful discussions and suggestions.

Rights and permissions

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
