IEEE P7003™ Standard for Algorithmic Bias Considerations
FairWare’18, May 2018, Gothenburg, Sweden A. Koene et al.
A brief paper outlining the aims of IEEE P7003 and its relationship to the other IEEE P700x series standards working groups was published in [3], and a tech-industry oriented summary of the eleven IEEE P70xx series standards appeared on the technology-industry blog TechEmergence [4].

In this paper we present a more detailed overview of the scope, structure and development process of the IEEE P7003 Standard for Algorithmic Bias Considerations [5].

IEEE P7003 is intended to be used by people and organizations who are developing and/or deploying automated decision (support) systems (which may or may not involve AI/machine learning) that are part of products or services that affect people. Typical examples include anything related to personalization or individual assessment, including any system that performs a filtering function by prioritizing the ease with which people will find some items over others (e.g. search engines or recommendation systems). Any system that produces different results for some people than for others is open to challenges of being biased. Examples could include:

• Security camera applications that detect theft or suspicious behaviour.
• Marketing automation applications that calibrate offers, prices, or content to an individual's preferences and behaviour.
• etc.

The requirements specification provided by the IEEE P7003 standard will allow creators to communicate to users and regulatory authorities that up-to-date best practices were used in the design, testing and evaluation of the algorithm to attempt to avoid unintended, unjustified and inappropriate differential impact on users.

Since the standard aims to allow for the legitimate ends of different users, such as businesses, it should assist them in assuring citizens that steps have been taken to ensure fairness, as appropriate to the stated aims and practices of the sector where the algorithmic system is applied. For example, it may help customers of insurance companies to feel more assured that they are not getting a worse deal because of the hidden operation of an algorithm.

As a practical example, an online retailer developing a new product recommendation system might use the IEEE P7003 standard as follows. Early in the development cycle, after outlining the intended functions of the new system, IEEE P7003 guides the developer through a process of considering the likely customer groups, in order to identify whether there are subgroups that will need special consideration (e.g. people with visual impairments). In the next phase of development, the developer establishes a testing dataset to validate whether the system is performing as desired. Referencing P7003, the developer is reminded of methods for checking whether all customer groups are sufficiently represented in the testing data, to avoid reduced quality of service for certain customer groups.

Throughout the development process, IEEE P7003 challenges the developer to think explicitly about the criteria that are used for the recommendation process and the rationale, i.e. justification, for why these criteria are relevant and why they are appropriate (legally and socially). Documenting these will help the business respond to possible future challenges from customers, competitors or regulators regarding the recommendations produced by the system. At the same time, this process of analysis will help the business be aware of the context in which the recommendation system can confidently be used, and which uses would require additional testing (e.g. age ranges of customers, types of products).

2 SCOPE

The IEEE P7003 standard will provide a framework that helps developers of algorithmic systems, and those responsible for their deployment, to identify and mitigate unintended, unjustified and/or inappropriate biases in the outcomes of the algorithmic system. "Algorithmic systems" in this context refers to the combination of algorithms, data and the output deployment process that together determine the outcomes that affect end users. Unjustified bias refers to differential treatment of individuals based on criteria for which no operational justification is given. Inappropriate bias refers to bias that is legally or morally unacceptable within the social context where the system is used, e.g. algorithmic systems that produce outcomes with differential impact strongly correlated with protected characteristics (such as race, gender or sexuality).

The standard will describe specific methodologies that allow users of the standard to assert how they worked to address and eliminate issues of unintended, unjustified and inappropriate bias in the creation of their algorithmic system. This will help to design systems that are more easily auditable by external parties (such as regulatory bodies).

Elements include:

• a set of guidelines for what to do when designing or using such algorithmic systems, following a principled methodology (process), engaging with stakeholders (people), determining and justifying the objectives of using the algorithm (purpose), and validating the principles that are actually embedded in the algorithmic system (product);
• a practical guideline for developers to identify when they should step back to evaluate possible bias issues in their systems, pointing to methods they can use to do this;
• benchmarking procedures and criteria for the selection of validation data sets for bias quality control;
• methods for establishing and communicating the application boundaries for which the system has been designed and validated, to guard against unintended consequences arising from out-of-bound application of algorithms;
• methods for user expectation management to mitigate bias due to incorrect interpretation of system outputs by users (e.g. correlation vs. causation), such as specific action points/guidelines on what to do if in doubt about how to interpret the algorithm outputs;
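The validation-data criteria above can be illustrated with a short sketch. This is a hypothetical example, not part of the P7003 specification: the function names, the minimum-count cutoff, and the 0.8 warning threshold (borrowed from the common "four-fifths" rule of thumb for disparate impact) are all illustrative assumptions.

```python
from collections import Counter

def representation_check(groups, min_count=30):
    """Return subgroups that appear fewer than min_count times in the
    validation data (illustrative cutoff, not a P7003 requirement)."""
    counts = Counter(groups)
    return {g: n for g, n in counts.items() if n < min_count}

def disparate_impact(groups, favorable):
    """Compute the ratio of the lowest to the highest per-group rate of
    favorable outcomes, plus the per-group rates; ratios below ~0.8 are
    often treated as a flag for further investigation."""
    totals = {}
    for g, f in zip(groups, favorable):
        tot, fav = totals.get(g, (0, 0))
        totals[g] = (tot + 1, fav + (1 if f else 0))
    rates = {g: fav / tot for g, (tot, fav) in totals.items()}
    return min(rates.values()) / max(rates.values()), rates
```

In the online-retailer example, `groups` might encode customer segments (including any subgroup identified as needing special consideration) and `favorable` whether a recommendation met the desired quality bar; a failing `representation_check` would prompt collecting more validation data before drawing conclusions about quality of service for that subgroup.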
(4), Social Sciences (3), Arts (2) and Natural Sciences (1)¹. In light of the nature of the topic of the P7003 standard, dealing with bias/discrimination, the working group also expressed special concern about establishing sufficient cultural diversity among its participants. As of early 2018, the participants who chose to indicate their geographic location were from: USA (11), UK (6), Canada (3), Germany (3), Brazil (2), India (2), Japan (2), the Netherlands (2), Australia (1), Belgium (1), Israel (1), Pakistan (1), Peru (1), Philippines (1), S. Korea (1) and Uganda (1), clearly indicating a strong N. America / W. Europe bias that has not yet been resolved. With respect to types of employers, the participants are roughly separated into 1/3 academic, 1/3 industry and 1/3 civil-society affiliations.

During the first eight months, the work of developing the standard focused on growing the participant membership and on exploratory discussions during the monthly conference calls to identify possible factors and sections that could be relevant for inclusion in the standard. Much of this centered on the foundational sections, which were mostly proposed by working group members as a result of these discussions. In the time between the monthly meetings, working group members are encouraged to develop the document content. During this initial exploratory phase, detailed document development was initiated primarily for two of the foundational sections, 'Taxonomy of Bias' and 'Legal frameworks related to Bias'.

As of January 2018, the standard development process has transitioned into the next phase, moving from the initial exploration of the problem space towards consolidation and specification of the standard document content. All P7003 working group members are asked to identify document sections for which they will take primary responsibility, with the aim of having teams of at least two participants for each section. The monthly conference calls will focus on updates from each of the teams to the complete working group regarding their progress during the intervening month and any issues that might require input from other teams. The calls will also be the primary opportunity for all other working group members to raise questions, make suggestions and/or volunteer to (temporarily) contribute to the work of another team.

Once the IEEE P7003 draft document is completed and approved by the IEEE P7003 working group, it will be submitted for balloting approval to the IEEE-SA. The IEEE-SA will send out an invitation-to-ballot to all IEEE-SA members who have expressed an interest in the subject, i.e. algorithmic bias. If the draft receives at least 75% approval, it is submitted to the IEEE-SA Standards Board Review Committee, which checks that the proposed standard is compliant with the IEEE-SA Standards Board Bylaws and Operations Manual. The Standards Board then votes to approve the standard, which requires a simple majority. At that point, about 2.5 to 3 years after the proposal for developing the standard was first submitted, the standard is published for use.

5 CONCLUSION

As part of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems, a series of eleven ethics standards is under development, designated IEEE P7000 through IEEE P7010. As outlined in this paper, the IEEE P7003 Standard for Algorithmic Bias Considerations aims to provide an actionable framework for improving the fairness of algorithmic decision-making systems, which are increasingly being developed and deployed by industry, government and other organizations. The IEEE P7003 standard is currently transitioning from an initial exploratory phase into a consolidation and specification phase. Participation in the IEEE P7003 working group is open to all who are interested in contributing towards reducing and mitigating unintended, unjustified and societally unacceptable bias in algorithmic decisions.

Minutes of recent IEEE P7003 working group meetings are available at [5].

ACKNOWLEDGMENTS

The participation of Ansgar Koene (working group chair) and Liz Dowthwaite (secretary) in the IEEE P7003 standards development process forms part of the UnBias project, supported by EPSRC grant EP/N02785X/1. UnBias is an interdisciplinary research project led by the University of Nottingham in collaboration with the Universities of Oxford and Edinburgh. For more information about the UnBias project, see http://unbias.wp.horizon.ac.uk/

REFERENCES

[1] The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. https://ethicsinaction.ieee.org/
[2] Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems, Version 2. IEEE, 2017. http://standards.ieee.org/develop/indconn/ec/autonomous_systems.html
[3] Ansgar Koene. Algorithmic Bias: Addressing Growing Concerns. IEEE Technology and Society Magazine, 36, 2 (June 2017), 31-32. DOI: http://dx.doi.org/10.1109/MTS.2017.2697080
[4] Daniel Faggella. The Ethics of Artificial Intelligence for Business Leaders – Should Anyone Care? TechEmergence, December 9, 2017. https://www.techemergence.com/ethics-artificial-intelligence-business-leaders/
[5] IEEE P7003 Working Group. http://sites.ieee.org/sagroups-7003/

¹ Numbers in brackets indicate the number of participants who identified as having the given expertise in an informal internal survey. Many participants chose not to respond, while some indicated multiple areas of expertise.