Att styra säkerhet med siffror : En essä om (att se) gränser [Managing safety by numbers: An essay on (seeing) limits]
Work, especially in complex, dynamic workplaces, often requires subtle, local judgment with regard to the timing of subtasks, relevance, importance, prioritization and so forth. Still, people in the nuclear industry seem to think that safety results from people simply following procedures. In the wake of fail...
Main Author: | Engström, Diana |
---|---|
Format: | Others |
Language: | Swedish |
Published: | Linnéuniversitetet, Institutionen för fysik och elektroteknik (IFE), 2015 |
Subjects: | säkerhetskultur; human performance tools; CAP; Corrective action program; the black swan |
Online Access: | http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-44940 |
id |
ndltd-UPSALLA1-oai-DiVA.org-lnu-44940 |
---|---|
record_format |
oai_dc |
spelling |
ndltd-UPSALLA1-oai-DiVA.org-lnu-44940 2015-06-30T04:53:31Z | Att styra säkerhet med siffror : En essä om (att se) gränser | swe | Engström, Diana | Linnéuniversitetet, Institutionen för fysik och elektroteknik (IFE) | 2015 | säkerhetskultur; human performance tools; CAP; Corrective action program; the black swan | Work, especially in complex, dynamic workplaces, often requires subtle, local judgment with regard to the timing of subtasks, relevance, importance, prioritization and so forth. Still, people in the nuclear industry seem to think that safety results from people simply following procedures. In the wake of failure it can be tempting to introduce new procedures and an even stricter "rule-following culture". No attention, or at least very little, is given to tacit knowledge and individual skills. I aim to highlight the inadequacy of putting too much trust in formalization, and in the idea that reporting and trending of events will contribute to increased learning, increased nuclear safety and efficient use of operational experience. The ability to interpret a concrete situation depends on proven experience of similar situations, analogical thinking and tacit knowledge. In this essay I intend to problematize the introduction and use of the so-called Corrective Action Program (CAP), and of the computerized reporting systems linked to CAP, in the nuclear industry. What I found is that the whole industry, from regulators to licensees, seems stuck in the idea that the scientific perspective on knowledge is the only "true" perspective. This leads to an exaggerated belief that technology and formalized work processes and routines will create safer operations. The computerized reporting system will not, as originally intended, contribute to increased nuclear safety, since the reports are based on the triggering event rather than on underlying causes and in-depth analysis. Managing safety by numbers (incidents, error counts, safety threats, and safety-culture indicators) is very practical but has its limitations. Error counts uphold an illusion of rationality and control, but may offer neither real insight nor productive routes for progress on safety. The question is why CAP, error counts and computerized reporting systems have had such a big impact in the nuclear industry, when they rest on such weak foundations. The answer is that the scientific perspective on knowledge is the dominant one. What people do not understand is that excessive use of computerized systems and increased formalization will actually create new risks, as people lose their skills and ability to reflect, and put more trust in the system than in themselves. | Student thesis | info:eu-repo/semantics/bachelorThesis | text | http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-44940 | application/pdf | info:eu-repo/semantics/openAccess |
collection |
NDLTD |
language |
Swedish |
format |
Others |
sources |
NDLTD |
topic |
säkerhetskultur; human performance tools; CAP; Corrective action program; the black swan |
spellingShingle |
säkerhetskultur; human performance tools; CAP; Corrective action program; the black swan; Engström, Diana; Att styra säkerhet med siffror : En essä om (att se) gränser |
description |
Work, especially in complex, dynamic workplaces, often requires subtle, local judgment with regard to the timing of subtasks, relevance, importance, prioritization and so forth. Still, people in the nuclear industry seem to think that safety results from people simply following procedures. In the wake of failure it can be tempting to introduce new procedures and an even stricter "rule-following culture". No attention, or at least very little, is given to tacit knowledge and individual skills. I aim to highlight the inadequacy of putting too much trust in formalization, and in the idea that reporting and trending of events will contribute to increased learning, increased nuclear safety and efficient use of operational experience. The ability to interpret a concrete situation depends on proven experience of similar situations, analogical thinking and tacit knowledge. In this essay I intend to problematize the introduction and use of the so-called Corrective Action Program (CAP), and of the computerized reporting systems linked to CAP, in the nuclear industry. What I found is that the whole industry, from regulators to licensees, seems stuck in the idea that the scientific perspective on knowledge is the only "true" perspective. This leads to an exaggerated belief that technology and formalized work processes and routines will create safer operations. The computerized reporting system will not, as originally intended, contribute to increased nuclear safety, since the reports are based on the triggering event rather than on underlying causes and in-depth analysis. Managing safety by numbers (incidents, error counts, safety threats, and safety-culture indicators) is very practical but has its limitations. Error counts uphold an illusion of rationality and control, but may offer neither real insight nor productive routes for progress on safety. The question is why CAP, error counts and computerized reporting systems have had such a big impact in the nuclear industry, when they rest on such weak foundations. The answer is that the scientific perspective on knowledge is the dominant one. What people do not understand is that excessive use of computerized systems and increased formalization will actually create new risks, as people lose their skills and ability to reflect, and put more trust in the system than in themselves. |
author |
Engström, Diana |
author_facet |
Engström, Diana |
author_sort |
Engström, Diana |
title |
Att styra säkerhet med siffror : En essä om (att se) gränser |
title_short |
Att styra säkerhet med siffror : En essä om (att se) gränser |
title_full |
Att styra säkerhet med siffror : En essä om (att se) gränser |
title_fullStr |
Att styra säkerhet med siffror : En essä om (att se) gränser |
title_full_unstemmed |
Att styra säkerhet med siffror : En essä om (att se) gränser |
title_sort |
att styra säkerhet med siffror : en essä om (att se) gränser |
publisher |
Linnéuniversitetet, Institutionen för fysik och elektroteknik (IFE) |
publishDate |
2015 |
url |
http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-44940 |
work_keys_str_mv |
AT engstromdiana attstyrasakerhetmedsiffrorenessaomattsegranser |
_version_ |
1716806744498241536 |