Automated decision-making vs indirect discrimination : Solution or aggravation?

The use of automated decision-making systems by public institutions, for example letting the system decide on the approval, determination, or denial of individuals' benefits, is an effective way to get more work done in less time and at lower cost than if humans did it. Yet as the technology has developed to help us in this way, so have the problems these systems can cause while they operate. Those primarily affected are the individuals who are denied benefits, health care, or pensions. The systems can perpetuate hidden, historical stigmatization and prejudice, and their decisions can disproportionately disadvantage members of historically marginalized groups, simply because the systems have learned to do so. There is also a risk that the programmer introduces her or his own bias, or translates the applicable legislation or policies incorrectly, causing the finished system to make decisions on unknown grounds and to demand more, less, or entirely different things than the requirements set out in public, written law. These systems work in the language of mathematical algorithms, which most ordinary individuals, public employees, and courts will not understand.

If you suspect that you have been discriminated against by an automated decision, successfully claiming discrimination before US, Canadian, and Swedish courts, the ECtHR, and the ECJ requires you to show on which of your characteristics you were discriminated against, and in comparison to which other group that was instead advantaged. Without any reasons or explanation for the decision being available to you as an applicant, or to the responsible court, the inability to identify such a comparator can lead to cases of actual indirect discrimination being dismissed. One solution could be to follow Sophia Moreau's theory and focus on the actual harm the individual claims to have suffered, instead of categorizing her or him by certain traits or searching for a suitable comparator. This resembles a ruling of the Swedish Court of Appeal in which a comparator was not necessary to establish that the applicant had been indirectly discriminated against by a public institution. Instead, the court focused on the harm the applicant claimed to have suffered, and then investigated whether the difference in treatment could be objectively justified. For Swedish and European legislation to meet the challenges that automated decision-making systems can raise, the Swedish Court of Appeal's model may be better suited to helping individuals affected by a potentially indirectly discriminatory automated decision of a public institution.

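To make the translation risk described above concrete, here is a minimal, purely illustrative Python sketch of how a statutory eligibility rule can be mistranslated into code. The benefit, the income limit, and both functions are invented for this example and correspond to no real statute or system; the point is only that a one-character coding error ("<" instead of "<=") makes the automated system silently demand more than the written law.

```python
# Hypothetical illustration: a statutory rule mistranslated into code.
# Assume a (fictional) law grants a housing benefit to applicants with a
# monthly income of AT MOST 12,000, inclusive. All names and numbers here
# are invented for this sketch.

INCOME_LIMIT = 12_000

def eligible_per_law(monthly_income: int) -> bool:
    """Faithful translation: the statute says income <= limit."""
    return monthly_income <= INCOME_LIMIT

def eligible_per_system(monthly_income: int) -> bool:
    """Buggy translation: a strict '<' quietly excludes borderline
    applicants whom the written law covers."""
    return monthly_income < INCOME_LIMIT

if __name__ == "__main__":
    for income in (11_999, 12_000, 12_001):
        law = eligible_per_law(income)
        system = eligible_per_system(income)
        flag = "  <-- system denies what the law grants" if law != system else ""
        print(f"income={income}: law={law}, system={system}{flag}")
```

An applicant denied at exactly the statutory limit would have no way of seeing this discrepancy, which is the abstract's point that the grounds for an automated decision can be unknowable to applicants and courts alike.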

Bibliographic Details
Main Author: Lundberg, Emma
Format: Bachelor thesis (student thesis)
Language: English
Published: Umeå universitet, Juridiska institutionen, 2019
Subjects:
automated decision-making
discrimination
bias
Law
Online Access: http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-161110