
Computer says No


[Image: Circuit board]

Heidegger wrote about the dangers of a technological view of the world taking over. [1] The idea that we are in danger of becoming controlled by robots may seem crazy – the stuff of science fiction. But, no, it is already happening. Here are two recent examples:

In recent days there has been a political scandal over how A-level grades will be awarded in the UK. Since students did not sit the exams because of Coronavirus, some system had to be found to assign grades. The system the government and the exams regulator came up with combined grades predicted by teachers with an algorithm based on the past performance of schools. Some students lost out and there was an outcry. (As a tangential comment we can note that the A-level system is designed to grade people. Part of the outcry is misplaced; the system is bound to create losers however the numbers are diced. That is its main purpose.) Nonetheless, there is a chilling element to this. The grades were to be created by an algorithm, not connected to the individuals at all. This is a totally depersonalising algorithm. The maw of the machine. In the end the system was rejected; the algorithm was designed in a very crude way and apparently produced untenable results. But it was tried.
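To see how depersonalising this kind of approach is, here is a minimal sketch of rank-based standardisation. The mapping rule, names and numbers below are my own assumptions for illustration, not the regulator's actual model; the point is simply that each grade is read off from the school's historical distribution and the student's rank, so nothing about the individual enters the calculation.

```python
# Hypothetical sketch: grades assigned from (a) the school's historical
# grade distribution and (b) each student's rank within the school.
# All names and figures are invented for illustration.

def assign_grades(ranked_students, historical_distribution):
    """ranked_students: list of names, best first.
    historical_distribution: dict grade -> fraction of past cohorts,
    e.g. {"A": 0.2, "B": 0.4, "C": 0.4} (fractions sum to 1).
    Returns a dict mapping student -> grade."""
    n = len(ranked_students)
    grades = {}
    i = 0
    for grade, fraction in historical_distribution.items():
        quota = round(fraction * n)  # places available for this grade
        for student in ranked_students[i:i + quota]:
            grades[student] = grade
        i += quota
    # Any students left over by rounding receive the lowest grade.
    lowest = list(historical_distribution)[-1]
    for student in ranked_students[i:]:
        grades[student] = lowest
    return grades

cohort = ["Asha", "Ben", "Carla", "Dev", "Ema"]   # ranked by the school
history = {"A": 0.2, "B": 0.4, "C": 0.4}          # the school's past results
print(assign_grades(cohort, history))
# {'Asha': 'A', 'Ben': 'B', 'Carla': 'B', 'Dev': 'C', 'Ema': 'C'}
```

However well Ema might have performed individually, her grade here is fixed by her school's history and her rank – which is exactly what the outcry was about.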

Another example I have noticed is not new, however. [2] It concerns a case which originally arose in 2014. A scandal emerged over a private company which had been conducting English language tests for UK visa applicants. A media exposé showed that there had been cheating on a widespread scale. The government asked the company to check its results. Using an automated system, the company carried out a comparative voice analysis and on this basis gave the government a list of students who were said to have cheated. According to the Independent report [2], students had their visas revoked purely on the basis of this automated analysis. Subsequent manual analysis suggested that the automated system had been wrong in 20% of cases. (The Independent report lacks some details about how this secondary investigation was done; possibly by lawyers representing some of the students in court.)

This is chilling. Decisions of enormous import for people's lives are being made purely on the basis of an algorithm; code which ultimately must choose an arbitrary cut-off point for when two recordings are said to come from the same voice and when not. For any system managing a mass number of cases to function at all and achieve its purpose – saving money and increasing profit by automating a task – it has to accept that some edge cases will be decided wrongly. If this is your broadband company estimating your expected usage to decide which adverts to send you, that is one matter. But in a case like this, where people's lives were being overturned, it is quite another. Again, the lack of compassion in allowing such a system to make decisions of this import is chilling. The people who permitted this must, or should, have known that a certain percentage of cases would be wrongly decided.
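The cut-off point is the heart of the problem, and it can be shown in a few lines. The similarity score and the threshold of 0.80 below are invented assumptions, not the company's actual system; the sketch only illustrates how two nearly indistinguishable cases end up with opposite outcomes.

```python
# Hypothetical sketch of a cut-off decision in voice comparison.
# Assumption: some upstream model produces a similarity score in [0, 1]
# for a pair of recordings; the 0.80 threshold is invented for illustration.

THRESHOLD = 0.80  # the arbitrary cut-off: at or above it, "same voice"

def flag_as_cheating(similarity_score: float) -> bool:
    """Return True if the two recordings are deemed the same voice
    (i.e. a proxy is presumed to have sat the test). Scores near the
    threshold are exactly the edge cases the system will sometimes
    get wrong."""
    return similarity_score >= THRESHOLD

# Two applicants with almost identical evidence, opposite outcomes:
print(flag_as_cheating(0.81))  # True  -> visa revoked
print(flag_as_cheating(0.79))  # False -> no action
```

Wherever the threshold is placed, some fraction of people just above it will be innocent; the 20% error rate later found is the price of automating the decision.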

As Heidegger says, it is not technology itself which is the problem. It is how it is used. And it seems there really are people who are prepared to put it to work in a totally heartless way.

Notes

  1. https://www.youtube.com/watch?v=MtATDlUSIx
  2. https://www.independent.co.uk/news/uk/politics/home-office-mistakenly-deported-thousands-foreign-students-cheating-language-tests-theresa-may-a8331906.html
[Image: Photo: Harland Quarrington/MOD / OGL v1.0 (http://NationalArchives.gov.uk/doc/open-government-licence/version/1/) - https://commons.wikimedia.org/wiki/File:Computer_Circuit_Board_MOD_45153623.jpg]