Journal Home > Volume 2, Issue 3




Framing and Language of Ethics: Technology, Persuasion, and Cultural Context

Jasmine E. McNealy
College of Journalism and Communications, University of Florida, Gainesville, FL 32601, USA

Abstract

What are the consequences of the language we use for technology, and of how we describe the frameworks surrounding its creation, use, and deployment? The language used to describe technology can deceive and abuse. How language is used demonstrates what can occur when one party is able to assert linguistic power over another, and the way in which organizations frame their relationships with technology is one such power asymmetry. This article examines the complications of the imagery used for ethics in technology. The author then offers a brief overview of how language influences our perceptions. The frames used to describe phenomena, including ethical frameworks and technology, allow for the creation of heuristics: shortcuts that are “good enough” for understanding what is being described and for decision-making. Descriptions therefore matter for relaying meaning and constructing narratives related to the ethical use of technical systems. The author next investigates what we mean by ethics, and the codes that corporate, governmental, and other organizations use to depict how they understand their relationship to the technology they create and deploy. The author explores three examples of frames of ethics and descriptions of technology which appear progressive but, once understood holistically, fail to adequately describe technology and its possible impact. The article ends with a discussion of the complexity of describing and communicating ethical uses of technology.

Keywords: culture, ethics, technology, language, framing

References (78)

1
S. Mattern, A city is not a computer, Places Journal, doi: 10.22269/170207.
2
B. Paris and J. Donovan, Deepfakes are troubling. But so are the ‘cheapfakes’ that are already here, https://slate.com/technology/2019/06/drunk-pelosi-deepfakes-cheapfakes-artificial-intelligence-disinformation.html, 2019.
3
S. Samuel, A guy made a deepfake app to turn photos of women into nudes. It didn’t go well, https://www.vox.com/2019/6/27/18761639/ai-deepfake-deepnude-app-nude-women-porn, 2019.
4

F. Ferreira, K. G. D. Bailey, and V. Ferraro, Good-enough representations in language comprehension, Current Directions in Psychological Science, vol. 11, no. 1, pp. 11–15, 2002.

5

F. Ferreira and N. D. Patson, The ‘Good Enough’ approach to language comprehension, Language and Linguistics Compass, vol. 1, no. 1&2, pp. 71–83, 2007.

6
M. D. Francois, C. George, and J. Stowell, Introducing Equiano, a subsea cable from Portugal to South Africa, https://cloud.google.com/blog/products/infrastructure/introducing-equiano-a-subsea-cable-from-portugal-to-south-africa/, 2019.
7
S. Shankland, Google’s third subsea cable will pump data from Portugal to South Africa, https://www.cnet.com/news/google-third-subsea-cable-equiano-connect-portugal-south-africa-nigeria/, 2019.
8
O. Equiano, The Life of Olaudah Equiano. New York, NY, USA: Cosimo, Inc., 2009.
9
J. P. Gee, An Introduction to Discourse Analysis: Theory and Method. London, UK: Taylor & Francis, 1999.
10
E. Goffman, Frame Analysis: An Essay on the Organization of Experience. Cambridge, MA, USA: Harvard University Press, 1974.
11
Y. Kazeem, Google and Facebook are circling Africa with huge undersea cables to get millions online, https://qz.com/africa/1656262/google-facebook-building-undersea-internet-cable-for-africa/, 2019.
12
P. Sawers, Google announces Equiano, a privately funded subsea cable that connects Europe with Africa, https://venturebeat.com/2019/06/28/google-announces-equiano-a-privately-funded-subsea-cable-that-connects-europe-with-africa/, 2019.
13

A. Appadurai, Disjuncture and difference in the global cultural economy, Theory, Culture & Society, vol. 7, no. 2&3, pp. 295–310, 1990.

14

C. H. de Vreese, News framing: Theory and typology, Information Design Journal, vol. 13, no. 1, pp. 51–62, 2005.

15

R. M. Entman, Framing: Toward clarification of a fractured paradigm, Journal of Communication, vol. 43, no. 4, pp. 51–58, 1993.

16
W. A. Gamson, Talking Politics. New York, NY, USA: Cambridge University Press, 1992.
17
A. Tversky and D. Kahneman, Rational choice and the framing of decisions, in Multiple Criteria Decision Making and Risk Analysis Using Microcomputers, B. Karpak and S. Zionts, eds. Berlin, Germany: Springer, 1989, pp. 81–126. doi: 10.1007/978-3-642-74919-3_4.
18

M. Edelman, Contestable categories and public opinion, Political Communication, vol. 10, no. 3, pp. 231–242, 1993.

19

D. A. Scheufele, Framing as a theory of media effects, Journal of Communication, vol. 49, no. 1, pp. 103–122, 1999.

20

D. A. Scheufele, Agenda-setting, priming, and framing revisited: Another look at cognitive effects of political communication, Mass Communication and Society, vol. 3, no. 2&3, pp. 297–316, 2000.

21

P. M. Napoli, Revisiting ‘mass communication’ and the ‘work’ of the audience in the new media environment, Media, Culture & Society, vol. 32, no. 3, pp. 505–516, 2010.

22

M. A. Cacciatore, D. A. Scheufele, and S. Iyengar, The end of framing as we know it … and the future of media effects, Mass Communication and Society, vol. 19, no. 1, pp. 7–23, 2016.

23

D. Chong and J. N. Druckman, Framing theory, Annual Review of Political Science, vol. 10, no. 1, pp. 103–126, 2007.

24

D. A. Scheufele and D. Tewksbury, Framing, agenda setting, and priming: The evolution of three media effects models, Journal of Communication, vol. 57, no. 1, pp. 9–20, 2007.

25
B. Green, The contestation of tech ethics: A sociotechnical approach to technology ethics in practice, Journal of Social Computing, doi:10.23919/JSC.2021.0018.
26

P. V. Lewis, Defining ‘business ethics’: Like nailing jello to a wall, Journal of Business Ethics, vol. 4, no. 5, pp. 377–383, 1985.

27

J. Fischer, Social responsibility and ethics: Clarifying the concepts, Journal of Business Ethics, vol. 52, no. 4, pp. 381–390, 2004.

28

D. Berdichevsky and E. Neuenschwander, Toward an ethics of persuasive technology, Communications of the ACM, vol. 42, no. 5, pp. 51–58, 1999.

29
W. J. Waluchow, The Dimensions of Ethics: An Introduction to Ethical Theory. Calgary, Canada: Broadview Press, 2003.
30
J. Driver, Consequentialism. Florence, KY, USA: Routledge, 2011. doi: 10.4324/9780203149256.
31
K. de Lazari-Radek and P. Singer, Utilitarianism: A Very Short Introduction. Oxford, UK: Oxford University Press, 2017. doi: 10.1093/actrade/9780198728795.001.0001.
32
S. Mhlambi, From rationality to relationality: Ubuntu as an ethical and human rights framework for artificial intelligence governance, Carr Center for Human Rights Policy Discussion Paper Series, https://carrcenter.hks.harvard.edu/publications/rationality-relationality-ubuntu-ethical-and-human-rights-framework-artificial, 2020.
33

C. A. Ellwood, The sociological basis of ethics, Int. J. Ethics, vol. 20, no. 3, pp. 314–329, 1910.

34
J. Lichtenberg, What are codes of ethics for? in Codes of Ethics and the Professions, M. Coady and S. Bloch, eds. Melbourne, Australia: Melbourne University Press, 1996, pp. 13–27.
35

G. Wood and M. Rimmer, Codes of ethics: What are they really and what should they be? International Journal of Value-Based Management, vol. 16, no. 2, pp. 181–195, 2003.

36

M. S. Frankel, Professional codes: Why, how, and with what impact? Journal of Business Ethics, vol. 8, no. 2&3, pp. 109–115, 1989.

37

G. Vinten, Business ethics: Busybody or corporate conscience? Managerial Auditing Journal, vol. 5, no. 2, pp. 4–11, 1990.

38
L. Stark and A. L. Hoffmann, Data is the new what? Popular metaphors & professional ethics in emerging data culture, Journal of Cultural Analytics, doi: 10.22148/16.036.
39
K. Conger, Google removes ‘Don’t Be Evil’ clause from its code of conduct, https://gizmodo.com/google-removes-nearly-all-mentions-of-dont-be-evil-from-1826153393, 2018.
40
D. Mayer, Why Google was smart to drop its ‘Don’t Be Evil’ motto, https://www.fastcompany.com/3056389/why-google-was-smart-to-drop-its-dont-be-evil-motto, 2016.
41

P. Sundstrom, Interpreting the notion that technology is value-neutral, Medicine Health Care and Philosophy, vol. 1, no. 1, pp. 41–45, 1998.

42
S. U. Noble, Algorithms of Oppression: How Search Engines Reinforce Racism. New York, NY, USA: New York University Press, 2018. doi: 10.2307/j.ctt1pwt9w5.
43
V. Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York, NY, USA: St. Martin’s Press, 2018.
44

L. Winner, Do artifacts have politics? Daedalus, vol. 109, no. 1, pp. 121–136, 1980.

45
M. R. Smith and L. Marx, Does Technology Drive History?: The Dilemma of Technological Determinism. Cambridge, MA, USA: MIT Press, 1994.
46
K. A. Gates, Our Biometric Future: Facial Recognition Technology and the Culture of Surveillance. New York, NY, USA: New York University Press, 2011. doi: 10.18574/nyu/9780814732090.001.0001.
47
J. Woodward, C. Horn, J. Gatune, and A. Thomas, Biometrics: A Look at Facial Recognition. Santa Monica, CA, USA: RAND Corporation, 2003.
48
Y. Usigan, 7 surprising ways facial recognition is used, https://www.cbsnews.com/pictures/7-surprising-ways-facial-recognition-is-used/, 2011.
49
Amazon Rekognition – Video and Image – AWS, https://aws.amazon.com/rekognition/, 2019.
50
J. Buolamwini and T. Gebru, Gender shades: Intersectional accuracy disparities in commercial gender classification, in Proc. the 1st Conference on Fairness, Accountability and Transparency, New York, NY, USA, 2018, pp. 77–91.
51
C. Garvie, A. Bedoya, and J. Frankle, The perpetual line-up: Unregulated police face recognition in America, https://www.perpetuallineup.org/, 2016.
52

M. B. Kent, Pavesich, property and privacy: The common origins of property rights and privacy rights in Georgia, John Marshall Law Journal, vol. 2, no. 1, 2009.

53
H. Demsetz, Toward a theory of property rights, in Classic Papers in Natural Resource Economics, C. Gopalakrishnan, ed. London, UK: Palgrave Macmillan, 2000, pp. 163–177. doi: 10.1057/9780230523210_9.
54

J. B. Baron, Rescuing the bundle-of-rights metaphor in property law, University of Cincinnati Law Review, vol. 82, no. 1, pp. 57–102, 2014.

55

D. R. Johnson, Reflections on the bundle of rights lecture, Vermont Law Review, vol. 32, pp. 247–272, 2007.

56

C. I. Harris, Whiteness as property, Harvard Law Review, vol. 106, no. 8, pp. 1707–1791, 1993.

57
U.S. Supreme Court, Johnson’s Lessee v. McIntosh, United States Reports, vol. 21, pp. 543–605, 1823.
58

M. Armstrong, African Americans and property ownership: Creating our own meanings, redefining our relationships, African-American Law and Policy Report, vol. 1, pp. 79–88, 1994.

59
City of Detroit, Project Green Light Detroit, https://detroitmi.gov/departments/police-department/project-green-light-detroit, 2016.
60
W. Feuer, Controversial project Green Light comes to Corktown, https://www.metrotimes.com/news-hits/archives/2018/10/31/controversial-project-green-light-comes-to-corktown, 2018.
61
C. Garvie and L. M. Moy, America under watch: Face surveillance in the United States, https://www.americaunderwatch.com, 2019.
62
S. Neavling, Researchers alarmed by Detroit’s pervasive, expanding facial-recognition surveillance program, https://www.metrotimes.com/news-hits/archives/2019/05/17/researchers-alarmed-by-detroits-pervasive-expanding-facial-recognition-surveillance-program, 2019.
63
S. Neavling, A condescending Chief Craig breaks silence about Detroit’s facial-recognition technology, https://www.metrotimes.com/news-hits/archives/2019/06/27/a-condescending-chief-craig-breaks-silence-about-detroits-facial-recognition-technology, 2019.
64
L. Sweeney, A. Abu, and J. Winn, Identifying participants in the personal Genome Project by name (A reidentification experiment), arXiv preprint arXiv: 1304.7605, 2013. doi: 10.2139/ssrn.2257732.
65

J. Karat, Evolving the scope of user-centered design, Communications of the ACM, vol. 40, no. 7, pp. 33–38, 1997.

66

J. Gulliksen, B. Göransson, I. Boivie, S. Blomkvist, J. Persson, and Å. Cajander, Key principles for user-centred systems design, Behaviour and Information Technology, vol. 22, no. 6, pp. 397–409, 2003.

67
68
M. R. Dickey, The future of diversity and inclusion in tech, http://social.techcrunch.com/2019/06/17/the-future-of-diversity-and-inclusion-in-tech/, 2019.
69
C. C. Perez, Invisible Women: Data Bias in a World Designed for Men. New York, NY, USA: Abrams, 2019.
70

T. Miaskiewicz and K. A. Kozar, Personas and user-centered design: How can personas benefit product design processes? Design Studies, vol. 32, no. 5, pp. 417–430, 2011.

71

A. L. Massanari, Designing for imaginary friends: Information architecture, personas and the politics of user-centered design, New Media and Society, vol. 12, no. 3, pp. 401–416, 2010.

72
Digital Team, Creating a user-friendly way to find neighborhood resources, https://www.boston.gov/news/creating-user-friendly-way-find-neighborhood-resources, 2019.
73
S. Goldsmith and S. Crawford, The Responsive City: Engaging Communities Through Data-Smart Governance. Hoboken, NJ, USA: John Wiley & Sons Inc., 2014.
74
R. T. Lakoff, Talking Power: The Politics of Language. New York, NY, USA: Basic Books, 1990.
75
T. Telford, ‘The world is not yet ready for DeepNude’: Creator kills app that uses AI to fake naked images of women, https://www.washingtonpost.com/business/2019/06/28/the-world-is-not-yet-ready-deepnude-creator-kills-app-that-uses-ai-fake-naked-images-women/, 2019.
76
Walk the Moon, Talking Is Hard, RCA Records, no. 88843-09809-2, 2014.
77

C. Schwarz-Plaschg, The power of analogies for imagining and governing emerging technologies, NanoEthics, vol. 12, no. 2, pp. 139–153, 2018.

78
L. Hu, Tech ethics: Speaking ethics to power, or power speaking ethics? Journal of Social Computing, doi:10.23919/JSC.2021.0033.

Publication history

Received: 20 May 2021
Revised: 20 November 2021
Accepted: 25 November 2021
Published: 13 January 2022
Issue date: September 2021

Copyright

© The author(s) 2021

Acknowledgements


J. E. McNealy would like to acknowledge the collective wisdom of the Ethical Tech Collective and those who participate in the Ethical Tech Working Group.

Rights and permissions

The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
