MMCC3031 Lecture 12
MEDIA
FORENSIC BIOMETRICS
Department of Media, Communications, Creative Arts, Literature and
Languages
Thus, what was operative here was a form of racial profiling that
deemed any group declared ‘suspect’ by the British to be always
already ‘criminal’, before any crime had actually been committed.
See: https://www.abc.net.au/news/2020-09-14/nt-stolen-generations-campaigner-lorna-cubillo-dies-aged-81/12659384
Raciality of Biometrics
The colonial origins of biometrics demonstrate how the category of
race was embedded in this technology right from the start.
See https://www.washingtonpost.com/technology/2019/12/19/federal-study-confirms-racial-bias-many-facial-recognition-systems-casts-doubt-their-expanding-use/
Raciality of Biometrics and Failure to Enrol
For example, Asian women have failed to enrol biometrically because
the image acquisition system cannot read the ridges on their fingers.
See https://www.raconteur.net/technology/biometrics-ethics-bias/
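To make the mechanism concrete, here is a minimal Python sketch of a failure-to-enrol gate; the quality measure, threshold value and scan data are invented for illustration and are not drawn from any real enrolment system.

# Hypothetical sketch of a "failure to enrol" check: if the sensor image's
# ridge contrast falls below a threshold tuned on one population's fingers,
# the system never creates a template for that person at all.
import numpy as np

RIDGE_CONTRAST_THRESHOLD = 25.0  # assumed value, calibrated on the "default" user group

def ridge_contrast(scan: np.ndarray) -> float:
    """Crude proxy for ridge clarity: local intensity variation across the scan."""
    gy, gx = np.gradient(scan.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

def try_enrol(scan: np.ndarray):
    """Return a stored template, or None if the subject fails to enrol."""
    if ridge_contrast(scan) < RIDGE_CONTRAST_THRESHOLD:
        return None  # faint or fine ridges: the person remains invisible to the system
    return {"template": scan.mean(axis=0)}  # stand-in for real minutiae extraction

# Example: a low-contrast scan (fine ridges, dry or worn skin) is simply rejected.
faint_scan = np.random.normal(128, 2, size=(256, 256))
print(try_enrol(faint_scan))  # -> None: failure to enrol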
Infrastructural Whiteness
What these examples evidence is that, even though biometric
advocates praise biometric technologies as ‘impartial’ and
‘objective’ because, unlike humans, they supposedly bring no
racial bias to bear when processing scanned subjects, the
technologies themselves are inscribed with a racial bias towards
whiteness.
White Templates
The racial bias towards white subjects is manifested in the fact that the image
acquisition properties of the biometric technology in question, such as lighting,
have been calibrated against the reflective properties of white skin only.
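A rough sketch of how such a calibration can operate in software follows; the brightness band, image values and function name are assumptions made for the example, not the settings of any actual product.

# Hypothetical sketch of a capture pipeline calibrated only against the
# reflectance of white skin: the exposure "quality gate" treats lower mean
# brightness as a bad capture rather than adjusting for the person in front
# of the camera.
import numpy as np

# Assumed acceptance band, tuned by testing the camera on light-skinned faces only.
MIN_MEAN_BRIGHTNESS = 110
MAX_MEAN_BRIGHTNESS = 200

def capture_ok(face_image: np.ndarray) -> bool:
    """Accept a capture only if its mean brightness falls inside the calibrated band."""
    return MIN_MEAN_BRIGHTNESS <= face_image.mean() <= MAX_MEAN_BRIGHTNESS

# Under identical lighting, skin with lower reflectance photographs darker, so the
# same gate that passes one subject rejects the other as a "failed" capture.
lighter_skin_capture = np.full((128, 128), 150, dtype=np.uint8)
darker_skin_capture = np.full((128, 128), 80, dtype=np.uint8)
print(capture_ok(lighter_skin_capture))  # True  -> proceeds to enrolment/matching
print(capture_ok(darker_skin_capture))   # False -> flagged as a failed capture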
This infrastructural software/algorithmic racial bias has real-world impact on Black and
Coloured people.
“In a 2015 scandal, Google’s facial recognition technology tagged two black American users
as gorillas due to biased inputs and incomplete training. At a time when police brutality in
the United States is at a peak, we can see how this biased data could lead to disastrous,
even violent results.”
Amanda Fawcett, “Understanding racial bias in machine learning algorithms,” Educative, June 08, 2020, https://www.educative.io/blog/racial-bias-machine-learning-algorithms
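The mechanism the quote points to, under-representation in training data, can be shown with a small toy simulation; the synthetic data, group sizes and accuracy figures below are invented for the example and stand in for no real dataset or system.

# Toy simulation of "biased inputs and incomplete training": a model trained
# almost entirely on one group performs markedly worse on the under-represented group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, flip):
    """Two groups whose features relate to the label slightly differently."""
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + (-X[:, 1] if flip else X[:, 1]) > 0).astype(int)
    return X, y

# Training set: 950 samples from group A, only 50 from group B.
Xa, ya = make_group(950, flip=False)
Xb, yb = make_group(50, flip=True)
model = LogisticRegression(max_iter=1000).fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Evaluation: accurate for the majority group, close to chance for the minority group.
Xa_test, ya_test = make_group(2000, flip=False)
Xb_test, yb_test = make_group(2000, flip=True)
print("group A accuracy:", (model.predict(Xa_test) == ya_test).mean())  # roughly 0.95 or higher
print("group B accuracy:", (model.predict(Xb_test) == yb_test).mean())  # roughly 0.5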
Racialised hardware/software split
When we examine the actual
production of digital technologies, we
find a perverse racialised
hardware/software split.
In this way, the state can surveil, control and monitor the
movement and activity of its subjects.
See Ali Breland, “How white engineers built racist code – and why it's
dangerous for black people,” The Guardian, 4 December 2017,
https://www.theguardian.com/technology/2017/dec/04/racist-facial-recognition-white-coders-black-people-police
Forensics of Biometrics
Precisely because biometric technologies capture and store
the identificatory features of a subject, such as their
fingerprints, they can be said to produce an indexical relation
between the biometric template and the biometrically
enrolled subject.
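A minimal sketch of what this indexical relation looks like at the level of data follows; the identifiers, matching rule and threshold are hypothetical and are not drawn from any actual border or asylum database.

# Minimal sketch of why a biometric template is "indexical": the stored template
# is bound to one identity record, so any later scan that matches it points back
# to that specific, already-enrolled body.
import numpy as np

database = {}  # subject identity record -> stored biometric template

def enrol(subject_id: str, fingerprint_features: np.ndarray):
    """Store the subject's template; from now on their body is the credential."""
    database[subject_id] = fingerprint_features

def identify(scanned_features: np.ndarray, threshold: float = 0.5):
    """1:N search: return whichever enrolled subject the new scan points back to."""
    for subject_id, template in database.items():
        if np.linalg.norm(scanned_features - template) < threshold:
            return subject_id
    return None

enrol("asylum-claim-0412", np.array([0.1, 0.9, 0.3, 0.7]))
# Later, at any checkpoint connected to the database, the same fingers
# re-identify the same file: the border travels with the body.
print(identify(np.array([0.12, 0.88, 0.31, 0.69])))  # -> "asylum-claim-0412"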
The border, then, for refugees and asylum seekers is not only
some external geographical and national boundary; rather, it
is now biometrically embodied, i.e., the border is now
biometrically inscribed on the refugee’s body.