‘Trapped in a code’ – the fight over our algorithmic future

“We ain’t your target market, and I ain’t your target market”, sang the band Sisteray on their 2018 song Algorithm Prison. They don’t wish, as they put it, to be “trapped in a code”. Who would? Yet Sisteray know that we are all exposed to a powerful and judgmental data gaze – as the camera zooms out at the end of their official video, the band members all stare blankly at their phones, the words “target market” etched into the dirt on the back windscreen of their van.

The song reflects how algorithms have risen in fame and notoriety over the last few years, and it is illustrative of a widely held concern over what they are doing to us. Often depicted as shadowy and constraining structural forces, algorithms are a source of significant anxiety. People often worry that these bits of code have a powerful but unknown sway over their lives. That anxiety has been magnified in recent days by the algorithmic grading system used to produce this year’s A Level results.

It seems there was an implicit assumption on the part of those awarding the grades that simply invoking the concept of the algorithm would be enough to reassure people of the objectivity and systematic fairness of the results. The calculative logic behind algorithmic decisions is often based on ideals of objectivity, neutrality and accuracy. By assuming that others would also see algorithms in those positive terms, they missed the existing unease and scepticism about such systems.

The backlash against the way an algorithm was used to decide people’s exam results, and against the unevenness of its outcomes for people from different backgrounds, gives us an insight into the ongoing tussle over our algorithmic future. This standardisation of grades tells us something about what we are willing to tolerate when being judged by algorithms.

Where we are conscious of their presence, algorithms are mostly tolerated rather than celebrated. There’s often a kind of grudging awareness and acceptance – although feelings toward algorithms already go well beyond uneasiness for those on the sharp end of automated decisions, especially where prejudice, discrimination and inequality are coded into what Safiya Umoja Noble has called ‘algorithms of oppression’.

Perhaps the most widely understood algorithmic processes to date, however, are those associated with tech platforms and social media. It is here that algorithms are most noticeably active in making predictions for us and about us, and where media coverage has tended to focus. But it turns out that those being examined did not wish their results to be calculated in the way their next YouTube clip, Netflix film, TikTok video, Instagram picture or Spotify song is selected. The models are, of course, very different, but this is the parallel that all the talk of algorithms was likely to draw. Other people’s data are considered acceptable for predicting cultural consumption, but the same cannot be said for predicting educational attainment.

It seems that there are forms of algorithmic prediction that are considered acceptable and others, clearly, that are not. All automation creates tensions, but some decisions or predictions (“here are the grades the algorithm says you would have got”) push beyond the tacitly established limits of broad acceptability.

There is a concern, as expressed in that Sisteray song, that algorithms are a kind of trap and that they routinely use our data to lock us into fixed patterns. In his recent book on the long development of such systems, the historian Colin Koopman describes how data-gathering processes have the effect of ‘fastening’. Both the boxes we are put in, and the gaps in the forms that are completed about us, hold us in place, he argues – with many of the categories and logics behind data usage having been established a century or more ago. This fastening process is also echoed in Deborah Lupton’s recent notion of Data Selves, in which we have become inseparable from our data. Both Koopman and Lupton point to how data are used to make us up. Recent events could be seen as a rejection of one aspect of that fastening. By reducing each student to a data point within a historic cohort, the grading system produced an overpowering loss of any sense of the individual. The students were not happy about being fastened in place by these particular algorithmic processes.
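To see what being reduced to a data point within a historic cohort can look like in practice, here is a minimal, purely illustrative sketch of cohort-based standardisation. It is not the actual model used for this year’s A Level results; the function, its parameters and the example data are all hypothetical. The point is simply that the grade is read off from the school’s past distribution and the student’s rank, so nothing about the individual’s own work enters the calculation.

```python
# Illustrative sketch only: NOT the model used for the 2020 A Level results.
# It shows how a cohort-based standardisation can 'fasten' a student to their
# school's historical grade distribution rather than to their own work.

def standardised_grade(student_rank, cohort_size, historical_grades):
    """Assign a grade purely from a student's rank within their cohort and the
    grades that the school's previous cohorts achieved (all inputs hypothetical).

    student_rank      -- 1 means top of the current cohort
    cohort_size       -- number of students in the current cohort
    historical_grades -- past grades for this school, best first, e.g. ['A*', 'A', ...]
    """
    # Turn the student's rank into a position between 0 (top) and 1 (bottom).
    position = (student_rank - 1) / max(cohort_size - 1, 1)

    # Read off the grade at the same position in the historical distribution:
    # the individual contributes nothing beyond where they sit in the ranking.
    index = round(position * (len(historical_grades) - 1))
    return historical_grades[index]


# A hypothetical school whose previous cohorts rarely achieved top grades.
past_results = ['A', 'B', 'B', 'C', 'C', 'C', 'D', 'D', 'E', 'E']

# A strong student ranked 2nd of 10 is still held to the school's history.
print(standardised_grade(student_rank=2, cohort_size=10,
                         historical_grades=past_results))  # -> B
```

Under this kind of logic, a strong student at a school with a weak record is capped by that record, which is precisely the sense of being fastened in place that the students objected to.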

When you combine the existing scepticism about algorithms with an algorithmic system so overt in its uneven outcomes, this level of reaction was always likely. It seems that the adjusted results will not now stand, but the full impact of what has happened is not yet clear. What is clearer is that the notion of the algorithm will continue to be a site of tension. In such a context, a crude belief in algorithms is unlikely to go unchallenged. It would seem that many of these A Level students agree with Sisteray – they don’t wish to be trapped in code.
