
MMCC3031 FORENSIC MEDIA
FORENSIC BIOMETRICS
Department of Media, Communications, Creative Arts, Literature and Languages
Professor Joseph Pugliese
Macquarie University
Content Warning
Aboriginal and Torres Strait Islander viewers are
respectfully advised that this slide presentation contains
names and references to deceased Indigenous persons.

Please be advised that this lecture also contains disturbing material related to acts of state violence.
Lecture Itinerary
Introduction to biometrics, biometric templates and
enrolment.

Colonial and racialised history of biometrics.

Raciality of biometrics, white templates and infrastructural whiteness.

Biopolitics of biometrics and the biometric state.

Forensic uses of biometrics.

Biometrics and social sorting.


Biometrics
Biometrics are technologies that visually scan a subject’s
features, such as their fingers, iris or face.

The process of having your physical features scanned is called biometric enrolment.
Biometric Templates
From this scan, the biometric system produces what is called
a biometric template, which is generated by the algorithmic
encoding of a subject’s distinctive physical features, so that
physical attributes are digitally transmuted into binary code.
Biometric Databases
The biometric template is stored in a database, which works
like a digital archive, and is used to verify a subject’s identity
every time they present themselves for biometric screening.
Every fresh biometric scan is checked against the initial
enrolment template in order to verify that it matches the
original biometric scan in the database.
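To make the enrolment and verification cycle described above concrete, here is a minimal, hypothetical sketch in Python. It does not reproduce any vendor's actual system: the feature encoding is a stub, and the names (Template, enrol, verify, extract_template) are invented for illustration.

```python
# Minimal, hypothetical sketch of biometric enrolment and verification.
# Real systems extract minutiae or facial landmarks and compare templates
# via a similarity score; here the pipeline is stubbed for illustration.

from dataclasses import dataclass


@dataclass
class Template:
    subject_id: str
    features: bytes  # physical attributes transmuted into binary code


DATABASE: dict[str, Template] = {}  # the 'digital archive' of templates


def extract_template(raw_scan: bytes, subject_id: str) -> Template:
    """Algorithmically encode a scan's distinctive features (stubbed)."""
    features = bytes(b ^ 0xA5 for b in raw_scan)  # placeholder encoding
    return Template(subject_id=subject_id, features=features)


def enrol(subject_id: str, raw_scan: bytes) -> None:
    """Biometric enrolment: generate the template and store it."""
    DATABASE[subject_id] = extract_template(raw_scan, subject_id)


def verify(subject_id: str, fresh_scan: bytes) -> bool:
    """Check a fresh scan against the stored enrolment template."""
    stored = DATABASE.get(subject_id)
    if stored is None:
        return False  # never enrolled: no template to match against
    return extract_template(fresh_scan, subject_id).features == stored.features


enrol("subject-001", b"fingerprint ridge pattern")
print(verify("subject-001", b"fingerprint ridge pattern"))  # True: match
print(verify("subject-001", b"different ridge pattern"))    # False: no match
```

In a deployed system the comparison would produce a similarity score checked against a threshold rather than an exact match, but the template-in-database logic is the same.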
Colonial and racialised history
of biometrics
Biometrics has a long colonial and racialised history of
imperial powers deploying anthropometry, or the scientific
measurement of bodies, in order to identify and govern
colonial populations.

The first biometric technology was inked fingerprinting.
Biometrics’ Racialised and
Colonial Histories
The inked fingerprint was first
developed by the British in
colonial India as a way of
attempting to govern a vast
country with a large population.

The biometric technology of the fingerprint was developed as a racial technology in order to construct a colonial system of identification and surveillance of subject populations, in the face of British administrators who could not distinguish one person of Colour from another, i.e., the racist syndrome of people of Colour all looking alike, or 'racial homogeneity.'
Biometric Racial Profiling
In British colonial India, any group that attempted to challenge
British colonial rule was designated as a ‘criminal tribe’ and each
member of the group was compelled to undergo fingerprinting as a
way of keeping a record of the identity of these ‘suspect’ subjects.
See Simon A. Cole, Suspect Identities: A History of Fingerprinting and Criminal Identification. Cambridge,
MA: Harvard University Press, 2002.

Thus, what was operative here was a form of racial profiling that
deemed any group declared by the British to be ‘suspect’ as always
already ‘criminal’ before the fact of having actually committed any
crime.

Biometric racial profiling is operative today in the discrimination and criminalization of people of Colour. See https://www.theguardian.com/technology/2017/dec/04/racist-facial-recognition-white-coders-black-people-police
Fingerprinting and Australia's colonial history
The colonial government in Australia also deployed the
inked fingerprinting system to record, surveil and govern
Aboriginal people.

Most infamously, in the process of forcibly taking Aboriginal children from their mothers during the Stolen Generations era, it often forced Aboriginal mothers, who could not speak or read English, to sign the 'release' of their child to white authorities with an inked fingerprint as evidence of their 'informed consent.'
The Cubillo Case
Aboriginal mothers had no idea that they were authorising the
removal of their children by placing their fingerprint on the state’s
document. Lorna Cubillo, a child of the Stolen Generations,
challenged the legality of this violent practice in the Cubillo case.

In the Cubillo case, Peter Gunner, a child of the Stolen Generations, argued that he had been unlawfully removed from his mother as his mother had no idea that her fingerprint was being used as a sign of her consent for the authorities to take him away, separate him from his family and place him in an institutional home to be assimilated into white ways.

See: https://www.abc.net.au/news/2020-09-14/nt-stolen-generations-campaigner-lorna-cubillo-dies-aged-81/12659384
Raciality of Biometrics
The colonial origins of biometrics demonstrate the way in which the category of race was embedded in this technology right from the start.

The raciality of biometrics is still evident in contemporary biometric technologies, as demonstrated by examples of Black people and people of Colour being unable to enrol biometrically because of the inbuilt racial bias in these technologies. By not being able to enrol biometrically, they are thereby excluded or blocked from entering relevant facilities or accessing relevant databases.

See https://www.washingtonpost.com/technology/2019/12/19/federal-study-confirms-racial-bias-many-facial-recognition-systems-casts-doubt-their-expanding-use/
Raciality of Biometrics and
Failure to Enrol
For example, Asian women have failed to enrol biometrically because the image acquisition system cannot read the ridges on their fingers.

Similarly, Black people have failed to enrol biometrically because the image acquisition system cannot read their faces for facial scans: the system captures only 'black blobs,' that is, it cannot produce images of their distinct facial features.

See https://www.raconteur.net/technology/biometrics-ethics-bias/
Infrastructural Whiteness
What these examples evidence is that, even though biometric advocates praise biometric technologies as being 'impartial' and 'objective' in not bringing to bear, as humans do, any racial bias in their processing of scanned subjects, the technologies themselves are inscribed with a racial bias towards whiteness.
White Templates
The racial bias towards white subjects is manifested by the fact that the image
acquisition properties of the biometric technology in question have had such
things as lighting calibrated against the reflective properties of white skin only.

Such technologies, then, can be seen to be infrastructurally calibrated to whiteness, i.e., whiteness is configured as the universal gauge that determines the technical settings and parameters for the visual imaging and capture of a subject's features.
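As a purely hypothetical illustration of what 'infrastructural calibration' can look like at the level of code, the sketch below implements an image-quality gate that only accepts captures whose average brightness falls inside a fixed window. If that window has been tuned on high-reflectance (white) skin only, lower-reflectance captures are disproportionately rejected as 'failure to acquire' before any template is produced. The function names and threshold values are invented for the example, not taken from any real product.

```python
# Hypothetical illustration of a quality gate calibrated to one skin tone.
# A capture must fall inside a fixed mean-brightness window to be accepted;
# if that window was tuned only against high-reflectance skin, darker-skinned
# subjects can be rejected at acquisition and never get to enrol.

def mean_brightness(pixels: list[int]) -> float:
    """Average intensity of an 8-bit greyscale capture (0-255)."""
    return sum(pixels) / len(pixels)


def passes_quality_gate(pixels: list[int], lo: int = 120, hi: int = 220) -> bool:
    """Accept a capture only if brightness sits in the calibrated window.
    The [120, 220] window is an invented value used purely for illustration."""
    return lo <= mean_brightness(pixels) <= hi


high_reflectance_capture = [180] * 100  # brighter capture: accepted
low_reflectance_capture = [70] * 100    # darker capture: rejected

print(passes_quality_gate(high_reflectance_capture))  # True: proceeds to enrolment
print(passes_quality_gate(low_reflectance_capture))   # False: 'failure to acquire'
```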
White ‘Universal Human’
Templates
White subjects, as we've discussed in previous lectures, thus emerge as the template, the 'universal' human subject against which all other racial groups are measured, evaluated and structurally marginalised, e.g., the Pioneer space probe and its 'universal' white humans.
UBER case of facial recognition
Biometrics’ Racialised
Algorithms
The fact that biometric technologies’ image acquisition systems are calibrated to the skin
of white people demonstrates that the algorithms that drive their digital systems are set to
code for whiteness. The algorithms, in other words, are not ‘neutral’ or ‘objective’ but,
rather, are inscribed with the racialised biases of the software designers.

This infrastructural software/algorithmic racial bias has real-world impacts on Black people and people of Colour.

“In a 2015 scandal, Google’s facial recognition technology tagged two black American users
as gorillas due to biased inputs and incomplete training.

In another example from 2018, a facial recognition tool used by law-enforcement misidentified 35% of dark-skinned women as men. The error rate for light-skinned men was only 0.8%.

At a time when police brutality in the United States is at a peak, we can see how this biased data could lead to disastrous, even violent results." Amanda Fawcett, "Understanding racial bias in machine learning algorithms," Educative, June 08, 2020, https://www.educative.io/blog/racial-bias-machine-learning-algorithms
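The disparities quoted above (35% versus 0.8%) are the kind of figures produced by disaggregating a system's error rate by demographic group. The short sketch below shows that audit step in Python; the records and group labels are invented placeholders, so the numbers illustrate the calculation only, not any real system.

```python
# Minimal sketch of a per-group error-rate audit for a classifier.
# Each record is (group, true_label, predicted_label); all data is invented.

from collections import defaultdict

records = [
    ("group_a", "woman", "man"),    # misclassification
    ("group_a", "woman", "woman"),
    ("group_a", "woman", "man"),    # misclassification
    ("group_b", "man", "man"),
    ("group_b", "man", "man"),
    ("group_b", "man", "woman"),    # misclassification
]

errors = defaultdict(int)
totals = defaultdict(int)

for group, true_label, predicted_label in records:
    totals[group] += 1
    if predicted_label != true_label:
        errors[group] += 1

# Report the error rate separately for each demographic group.
for group in sorted(totals):
    rate = errors[group] / totals[group]
    print(f"{group}: error rate {rate:.1%} ({errors[group]}/{totals[group]})")
```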
Racialised
hardware/software split
When we examine the actual
production of digital technologies, we
find a perverse racialised
hardware/software split.

The bulk of digital technologies are made and assembled in the countries of the Global South, and usually by Asian women, because of their so-called 'nimble fingers.' Thus, they are crucial to the production of the hardware of digital technologies.

Yet, by being excluded, for example, from biometric enrolment because of infrastructural whiteness, they are excluded at the software/operating-system level of these same technologies, which cannot read their features.
Racialised “Integrated Circuits”
Martin Kevorkian discusses this racialised hardware/software split in his analysis of how Black people and people of Colour supply the labour for the actual production and service of computer technologies in conditions that are either tantamount to prisons (offshore assembly plants in Asia) or are actual prisons (African American prisoners working from US prisons).

In his discussion of the US computer industry, Kevorkian exposes how African Americans constitute the bulk of the population of US prisons and how Black prisoners are compelled to work as on-call service providers for the computer industry in conditions that reproduce the racialised slavery of the past.

See Martin Kevorkian, Color Monitors: The Black Face of Technology in America. Ithaca and London: Cornell University Press, 2006, pp. 74-114.
Biopolitics of Biometrics
The French theorist Michel Foucault developed a critical understanding of the relation between the exercise of political power and bodies. He terms this relation 'biopolitics.'
Biopolitics refers to the manner in which a state exercises
control and governance of its subjects by documenting the
individual physical attributes of a subject’s body through the
use of a variety of visual technologies, including biometrics,
photography and so on.*

* See Michel Foucault, 'Society Must Be Defended': Lectures at the Collège de France, 1975-1976. New York: Picador.
The Biopolitical Biometric State

The biopolitical state works to affix the name and identity of a subject to their visual scan, and then to store this biodata in large databases.

In this way, the state can surveil, control and monitor the
movement and activity of its subjects.

Biometric data is used by such state institutions as immigration and border control, social security, the police and so on.
Biopolitics of Biometric Racial
Profiling
“Racialized code: Experts such as Joy Buolamwini, a researcher at the MIT
Media Lab, think that facial recognition software has problems recognizing
black faces because its algorithms are usually written by white engineers who
dominate the technology sector. These engineers build on pre-existing code
libraries, typically written by other white engineers.”

A report on police biometric racial profiling of African Americans has "found that black individuals, as with so many aspects of the justice system, were the most likely to be scrutinized by facial recognition software in cases. It also suggested that software was most likely to be incorrect when used on black individuals – a finding corroborated by the FBI's own research. This combination, which is making Lynch's and other black Americans' lives excruciatingly difficult, is born from another race issue that has become a subject of national discourse: the lack of diversity in the technology sector."

See Ali Breland, "How white engineers built racist code – and why it's dangerous for black people," The Guardian, 4 December 2017, https://www.theguardian.com/technology/2017/dec/04/racist-facial-recognition-white-coders-black-people-police
Forensics of Biometrics
Precisely because biometric technologies capture and store
the identificatory features of a subject, such as their
fingerprints, they can be said to produce an indexical relation
between the biometric template and the biometrically
enrolled subject.

That is, the biometric template can be matched against the features of a subject in order to verify or authenticate their identity.
When there is a match between the template stored in the
database and the fresh biometric scan, then the identity of
the subject can be verified.
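The matching just described is one-to-one verification: a fresh scan is checked against a single enrolled template. Forensic and police searches commonly also run one-to-many identification, in which a recovered print or image is scored against every template in a database to produce candidate identities. The sketch below is a hypothetical, self-contained illustration of that search; the database contents, similarity measure and threshold are all invented.

```python
# Hypothetical sketch of a one-to-many (1:N) forensic search: a recovered
# scan is scored against every enrolled template, and candidates whose
# score clears a threshold are returned, best match first.

def similarity(a: bytes, b: bytes) -> float:
    """Fraction of positions at which two equal-length encodings agree."""
    if len(a) != len(b):
        return 0.0
    return sum(x == y for x, y in zip(a, b)) / len(a)


# The 'digital archive': subject identities mapped to enrolment templates.
database = {
    "subject-001": b"ridge-pattern-alpha",
    "subject-002": b"ridge-pattern-gamma",
    "subject-003": b"loop-whorl-pattern!",
}


def identify(recovered_scan: bytes, threshold: float = 0.8) -> list[tuple[str, float]]:
    """Return (subject_id, score) candidates whose score clears the threshold."""
    scores = [(sid, similarity(recovered_scan, tpl)) for sid, tpl in database.items()]
    return sorted((s for s in scores if s[1] >= threshold),
                  key=lambda s: s[1], reverse=True)


print(identify(b"ridge-pattern-alphc"))  # near-match: [('subject-001', 0.947...)]
```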
Biometric Legal Subjecthood
A person’s biometric template, then, can be said to constitute a person’s
legal subjecthood or identity, i.e., it proves who you say you are or
proves that you are not actually who you say you are when your
template doesn’t match your biometric scan.

Thus, biometrics is now being used by the police, immigration authorities and social security agencies to verify the identities of subjects.
Immigration and border control
biometrics: passports
Biometric social security cards
Biometric police uses
Digital Social Sorting
David Lyon, surveillance studies theorist, has coined the
term ‘digital social sorting’ in order to describe the manner in
which digital identification systems such as biometrics can
work to reproduce unequal relations of power that advantage
some subjects while disadvantaging others according to
perceived levels of so-called ‘risk assessment.’*

If a person is perceived as having a high 'risk assessment' based, for example, on their ethnicity, they will be compelled to undergo extra and intensive security checks at airports, e.g., 'flying while Muslim,' or 'driving while Black.'
* David Lyon, Surveillance as Social Sorting: Privacy, Risk and Digital Discrimination, New York: Routledge, 2003.
Racialised Social Sorting at
airports
The 9/11 Terrorist Attacks
On 11 September 2001, a
group of al-Qaeda terrorists
hijacked four passenger
planes, crashing two of the
planes into New York’s
Twin Towers and killing
over 2000 civilians.

Immediately after the attack, law enforcement agencies scanned all available CCTV footage to see if they could identify the faces of the terrorists.
Visual technologies of the biopolitical state:
9/11 ‘faces of terror’ captured on CCTV
Biometrics’ ‘faces of terror’
The identification of the 9/11 faces of terror worked to
‘conjure up the idea of an amorphous, racialized, and
fetishized enemy Other that had penetrated both the national
territory and the national imagination.’
The ‘“faces of terror” metaphor invoked specific objects:
mug-shots and grainy video images of Arab men.’

The 'idea that certain faces could be inherently "faces of terror" – that individuals embody terror or evil on their faces – could not help but invoke a paranoid discourse of racialized otherness.'*
*Kelly Gates 2006, ‘Identifying the 9/11 “Faces of Terror,”’ Cultural Studies, 20.4-5: 434.
Contemporary imperial uses of
biometrics
In Iraq and Afghanistan, the occupying US army has
deployed biometric technologies through a program called
the Biometrics Automated Toolset (BAT) to scan and store
the features of the male citizens of those two countries.

Biometrics, in this context, functions as a US technology of surveillance and control of the male populations of these two imperially occupied countries.
US army biometrically scanning
men in Afghanistan and Iraq
Biometric Biocriminals
Digital discrimination means
that those categorised as
‘suspect’ are criminalised in
advance of actually having
committed any offence.

Biopolitically, they are framed as types of biocriminals: you are always already a criminal merely because of your ethnicity or how you look.

For example, the racialised descriptor 'of Middle Eastern appearance' = terrorist.
Biometrics as a technology of
imperial governance
The US military’s use of BAT biometrics functions as a
technology of imperial governance of subject populations.

It evidences the existence of an anatomy of biopolitical power that is technologically networked through interlinked databases (in the US, Afghanistan and Iraq) in order to facilitate ever more rigorous imperial regimes of governmentality and surveillance.
Biopolitics of biometrics:
asylum seekers and refugees
Both Australia and the European Union have deployed
biometrics at the border, compelling refugees and asylum
seekers to biometrically enrol before their claims for asylum
can be processed.

Australian legislation allows border officials to use force if asylum seekers are reluctant to have their features biometrically scanned.

Thus, the most vulnerable of subjects, refugees fleeing persecution and violence, are profiled through these practices as both 'suspect' and potentially 'criminal.'
Biometric scanning of refugees
Biometrically Embodied Border

The EU uses the Eurodac biometric system to keep a massive database of all enrolled refugees and asylum seekers attempting to enter its borders or already within its borders.

The border, then, for refugees and asylum seekers is not only
some external geographical and national boundary; rather, it
is now biometrically embodied, i.e., the border is now
biometrically inscribed on the refugee’s body.

The border, for a refugee, is thus mobile as it follows the movements of an asylum seeker or refugee as they move across national borders and checkpoints.
Biometrically criminalising
asylum seekers and refugees
Once a refugee’s biometric
data enters a biometric
database such as Eurodac, it
means that they are always
under a form of surveillance
and border control.

In despair, some refugees have actually attempted to erase the identifying features of their fingerprints by cutting away at the ridges on their fingers so that the biometric system cannot read their fingerprints.
Conclusion

As a technology of identification, surveillance and control, biometrics is embedded in colonial, imperial and racialised histories.

Contemporary biometric technologies are underpinned by the use of the white subject as a type of universal template that sets the image acquisition algorithms of the technology. Consequently, many biometric systems fail to read the features of Black people and people of Colour, who are thus unable to enrol in biometric systems and are excluded from accessing legitimate sites and services.

This colonial and imperial history informs the contemporary use of biometrics by Western states to criminalise racialised subjects in theatres of war (e.g., Afghanistan) and at the border (e.g., asylum seekers and refugees).
Conclusion continued
Because biometric technologies can be used to identify
and authenticate a person’s legal identity, they are
invested with a forensic evidentiary power and they can
be productively used to solve criminal cases.
As such, biometric technologies have been deployed by
the contemporary biopolitical state to surveil, monitor
and govern its subjects.
Biometrics must also be seen as working to produce
unequal power relations of digital discrimination and
social sorting, e.g., the ‘trusted traveller’ and the
‘immigrant traveller.’
