Journal Home > Volume 2, Issue 4


Algorithmic Silence: A Call to Decomputerize

Jonnie Penn
Department of History and Philosophy of Science, University of Cambridge, Cambridge, CB2 3RH, UK, and also with the Berkman Klein Center, Harvard University, Cambridge, MA 02138, USA

Abstract

Tech critics become technocrats when they overlook the daunting administrative density of a digital-first society. The author implores critics to reject structural dependencies on digital tools rather than naturalize their integration through critique and reform. At stake is the degree to which citizens must defer to unelected experts to navigate such density. Democracy dies in the darkness of sysadmin. The argument and a candidate solution proceed as follows. Since entropy is intrinsic to all physical systems, including digital systems, perfect automation is a fiction. Concealing this fiction, however, are five historical forces usually treated in isolation: ghost work, technical debt, intellectual debt, the labor of algorithmic critique, and various types of participatory labor. The author connects these topics to emphasize the systemic impositions of digital decision tools, which compound entangled genealogies of oppression and temporal attrition. In search of a harmonious balance between the use of “AI” tools and the non-digital decision systems they are meant to supplant, the author draws inspiration from an unexpected source: musical notation. Just as musical notes require silence to be operative, the author positions algorithmic silence—the deliberate exclusion of highly abstract digital decision systems from human decision-making environments—as a strategic corrective to the fiction of total automation. Facial recognition bans and the Right to Disconnect are recent examples of algorithmic silence as an active trend.

Keywords:

technocracy, algorithmic silence, history, labor, artificial intelligence, AI ethics, automation, decomputerization
Received: 20 May 2021 | Revised: 23 November 2021 | Accepted: 25 November 2021 | Published: 30 January 2022 | Issue date: December 2021

Copyright

© The author(s) 2021

Acknowledgements

Special thanks to Sarah T. Hamid, Sarah Dillon, Stephanie Dick, Richard Staley, Helen Anne Curry, Momin M. Malik, Mustafa Ali, Mary Gray, Sean McDonald, William Lazonick, Ernesto Oyarbide-Magaña, Ben Green, and attendees of the 2020 Istanbul Privacy Symposium.

Rights and permissions

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
