Digitisation and discrimination: Are algorithms the modern version of Long Island’s low-hanging bridges?

The proliferation of digital technology has meant that we now live in a world fully submerged in digital culture. This ‘state of affairs after the initial upheaval caused by computerisation’ (Cramer, 2015: 14) is what Cramer refers to as the post-digital. During the seminar this week, we contested the idea of technology being neutral, agreeing with Feenberg, who states that it ‘embodies the values of a particular industrial civilisation’ (2002: 5). An example of this can be found in the low-hanging bridges on Long Island, which, according to Winner, ‘were deliberately designed to achieve a particular social effect’ (1986: 123) by denying access to the buses that the lower classes and the Black community relied on at the time. Winner argued that this was intended to produce a better experience for the white middle class, who were free to enjoy the roads on their own (1986). Despite being designed in a pre-digital world, the low-hanging bridges show how machines, structures and systems can come to embody forms of authority and power.

(Larvalsubjects, 2018)

As digital technology has become increasingly intertwined with our everyday lives, it is important to reflect on how, where and why certain power dynamics present themselves in modern digital technology. There is a case to suggest that protocols, algorithms and AI decision-making are the digital versions of the Long Island bridges, with a productive power to explain and produce the sociopolitical logics of contemporary civilisation. This becomes particularly pertinent when considering the growth of the scored society, whereby ‘access to public and private services are increasingly being mediated through algorithms’ (McCann, Hall & Warin, 2018) which use personal data to decide whether certain digital profiles match the requirements for access. A growing trend of algorithms granting or denying access to services is evident in the healthcare industry, where patients are prioritised for surgical procedures based on their data profiles. In theory, ‘the algorithm would help health systems allocate resources to achieve optimal health outcomes’ (Owens & Walker, 2020: 1327); in practice, however, this method has been shown to reinforce health inequalities and produce racist decisions (McCann, Hall & Warin, 2018).
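To make the logic of the scored society more concrete, the short Python sketch below shows one way such gatekeeping can work. Every field name, weight and threshold here is hypothetical, invented purely to illustrate how a personal data profile gets reduced to a single pass/fail decision.

```python
# A minimal, hypothetical sketch of algorithmic gatekeeping: a data profile is
# collapsed into one score, and access is granted only if that score clears a
# fixed cut-off. None of these fields or weights come from a real system.

def eligibility_score(profile: dict) -> float:
    """Combine features of a digital profile into a single number."""
    weights = {"income": 0.5, "postcode_risk": -0.3, "years_of_history": 0.2}
    return sum(weight * profile.get(field, 0.0) for field, weight in weights.items())

def grant_access(profile: dict, threshold: float = 1.0) -> bool:
    """One opaque comparison stands in for a human judgement about eligibility."""
    return eligibility_score(profile) >= threshold

# Two applicants who differ only in where they live receive different outcomes.
applicant_a = {"income": 2.4, "postcode_risk": 1.0, "years_of_history": 3.0}
applicant_b = {"income": 2.4, "postcode_risk": 4.0, "years_of_history": 3.0}
print(grant_access(applicant_a))  # True  (score 1.5)
print(grant_access(applicant_b))  # False (score 0.6)
```

The political point is that a feature such as ‘postcode_risk’ never names race, yet if postcodes correlate with race the formula quietly reproduces the same exclusion the bridges enforced in concrete.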

(Bloomberg Quicktake: Now, 2020)

A new study of kidney patients in Boston is one of the first to document the harm that can be caused when race is factored into algorithms. It examined ‘a widely used but controversial formula for estimating kidney function that by design assigns Black people healthier scores’ (Simonite, 2020), and found that a third of Black patients were placed in a less severe health category than white patients assessed with the same formula (Simonite, 2020). In this instance, race was explicitly factored into the code to smooth out statistical differences in the data; however, other medical algorithms ‘do not use race as a predictor’ (Owens & Walker, 2020) in their models and still produce discriminatory results.
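To see how such a race coefficient shifts scores, the sketch below follows the general shape of the MDRD-style eGFR calculation discussed in the Wired piece. The coefficients approximate the published MDRD Study equation, but exact values vary across equation versions and laboratories, so every number here should be read as an illustrative assumption rather than clinical code.

```python
# Illustrative sketch (not clinical code) of an MDRD-style eGFR estimate.
# Coefficients approximate the published MDRD Study equation; treat them as
# assumptions for illustration only.

def egfr_mdrd_style(creatinine_mg_dl: float, age: int,
                    female: bool, black: bool) -> float:
    """Estimated kidney function in mL/min/1.73 m^2; higher means 'healthier'."""
    egfr = 175.0 * (creatinine_mg_dl ** -1.154) * (age ** -0.203)
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212  # the race coefficient at issue: identical labs, higher score
    return egfr

# Two patients with identical lab results and ages score differently purely
# because of the race flag, which can move one of them into a milder category.
print(round(egfr_mdrd_style(1.8, 60, female=False, black=False), 1))
print(round(egfr_mdrd_style(1.8, 60, female=False, black=True), 1))
```

With these example inputs, the race-adjusted score lands above a commonly used staging boundary of around 45 mL/min while the unadjusted score falls below it, which is the kind of reclassification into a less severe category that the Boston study measured.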

If we revisit the low-hanging bridges, a main criticism of Winner’s argument came from Joerges (1999), who stated that the bridges were not designed with the intention of being racist, highlighting that buses were excluded from those roads anyway. It could be argued that, even if this was not intentional, the bridges simply mirrored the racism already embedded in society. This relates to how algorithms digitally reproduce biases that existed long before the technology itself, highlighting a crucial need to design them with actively anti-racist principles at their core.

References:

Bloomberg Quicktake: Now (2020) ‘Are Algorithms Racist?’ [online video] Available at: https://www.youtube.com/watch?v=971CFnYrBgw [Accessed: 27 October 2020]

Cramer, F. (2015) ‘What Is “Post-digital”?’ in Berry, D.M. and Dieter, M. (eds) Postdigital Aesthetics. London: Palgrave Macmillan.

Feenberg, A. (2002) Transforming Technology: A Critical Theory Revisited. 2nd ed. Oxford: Oxford University Press.

Joerges, B. (1999) ‘Do Politics Have Artefacts?’, Social Studies of Science, 29(3), pp. 411-431.

Larvalsubjects (2018) ‘Do Artefacts Have Politics?’ [online image] Available at: https://larvalsubjects.wordpress.com/2018/07/12/do-artifacts-have-politics/ [Accessed: 26 October 2020]

McCann, D., Hall, M. and Warin, R. (2018) Controlled by Calculations? Power and Accountability in the Digital Economy. New Economics Foundation. [PDF] Available at: https://neweconomics.org/uploads/files/Controlled-by-calculations.pdf [Accessed: 25 October 2020]

Owens, K. and Walker, A. (2020) ‘Those designing healthcare algorithms must become actively anti-racist’, Nature Medicine, 26(9), pp. 1327-1328.

Simonite, T. (2020) ‘How an Algorithm Blocked Kidney Transplants to Black Patients’. Wired. [online] Available at: https://www.wired.com/story/how-algorithm-blocked-kidney-transplants-black-patients/ [Accessed: 25 October 2020]

Winner, L. (1986) ‘Do Artefacts Have Politics?’ in The Whale and the Reactor: A Search for Limits in an Age of High Technology. Chicago: University of Chicago Press.

 
