High-risk sociotechnical systems such as nuclear installations must be both safe and secure. Whereas safety governance aims at preventing accidents, security deals with deliberate, unauthorized acts against nuclear facilities or materials intended to cause harm (IAEA 2008, 2010). Although security now ranks high on the nuclear policy agenda, the security regime for nuclear installations and infrastructures remains far less developed than the safety regime. At present, numerous measures are being taken to enhance security, such as increasing management oversight of security, building physical barriers around nuclear organizations, and changing the attitudes and behaviors of personnel. As these measures are being implemented across nuclear security regimes, it is unclear how they will be enacted and how they align with measures taken in the interest of safety. To give two examples: physical barriers may enhance security but could well hinder safety in the case of an emergency, and a questioning attitude may at times conflict with the need for prudence and discretion. Furthermore, tensions may appear between fostering a productive innovation culture, which encourages "thinking out of the box" and the taking of risks, and safety culture, which relies on norms and rules aimed at keeping risks at a reasonably low level (Tellis et al. 2007; Rollenhagen 2010).
Because scholars have studied aspects such as nuclear safety, security, and innovation culture in relative isolation, there is a dearth of studies examining how these cultures interrelate and how they are enacted in organizational structures and principles and through contextual variables. Hence, this PhD research project analyzes the dynamic interplay among safety, security, and innovation and the tensions this interplay induces. It conceptualizes the nature of these tensions and considers the implications of their coexistence for high-risk organizations, in particular those that are also concerned with research.
The approach developed in this PhD thesis will draw on vulnerability analysis, science and technology studies, and interpretive approaches to risk and uncertainty (Perrow 1984; La Porte 1996; Beck et al. 2003; Rossignol 2016). In contrast to classical risk analysis, which focuses on acquiring accurate information about hazards and creating adequate barriers, vulnerability analysis investigates the capacity of sociotechnical systems to survive, adapt, and maintain their function regardless of a hazard's likelihood. The project will build on previous research conducted within the PISA program on vulnerability-based assessment, which highlights the importance of studying "grey zones" where decisions are played out and meanings of safety and security are negotiated, irrespective of formal organizational prescriptions and policies (Rossignol 2016). The project will highlight how individuals and organizations handle difficult trade-offs between competing concerns, rather than circumvent or ignore difficult decisions (Rollenhagen 2010).
These theoretical frameworks (Perrow 1984; La Porte 1996; Beck et al. 2003; Rossignol et al. 2014; Rossignol 2016) serve complementary purposes. The focus on vulnerability, as a complement to classical risk analysis, will enable a system-oriented assessment that reveals how an organization can cope with, and adapt to, the inherent tensions between safety, security, and innovation. By bringing a science and technology studies perspective to vulnerability analysis, the project emphasizes: a) drawing together technical and social aspects of vulnerability in the management of security, safety, and innovation; and b) a participatory approach that opens up the research and its findings to joint inquiry. Finally, the use of interpretive theories of risk and uncertainty allows for qualitative (rather than only quantitative) appraisal of these aspects, highlighting "what an organisation does" in terms of practice, rather than "what an organisation has" in terms of norms, rules, or procedures. Consequently, the project is expected to advance risk and vulnerability theory and to help render organizations such as SCK•CEN more sociotechnically resilient and robust.
Beck, U., Bonss, W., and Lau, C. 2003. "The theory of reflexive modernisation: problematic, hypotheses and research programme." Theory, Culture and Society 20(2):1-33.
International Atomic Energy Agency (IAEA) 2008. Nuclear Security Culture; http://www-pub.iaea.org/MTCD/publications/PDF/Pub1472_web.pdf
International Atomic Energy Agency (IAEA) 2010. The Interface between Safety and Security at Nuclear Power Plants; http://www-pub.iaea.org/MTCD/publications/PDF/Pub1347_web.pdf
La Porte, T.R. 1996. "High reliability organizations: Unlikely, demanding and at risk." Journal of Contingencies and Crisis Management 4(2):60-71.
Perrow, C. 1984. Normal Accidents: Living with High-Risk Technologies. New York: Basic Books.
Rollenhagen, C. 2010. "Can focus on safety culture become an excuse for not rethinking design of technology?" Safety Science 48(2):268-278.
Rossignol, N., Turcanu, C., Fallon, C., and Zwetkoff, C. 2014. "How are you Vulnerable?: Using Participation for Vulnerability Analysis in Emergency Planning." Journal of Risk Research. Published online 26 Sep 2014; http://www.tandfonline.com/doi/abs/10.1080/13669877.2014.961522
Rossignol, N. 2016. "On vulnerability and vulnerabilities of incident reporting." PhD Thesis. SCK•CEN and University of Liège.
Sujan, M. 2015. "An organisation without a memory: A qualitative study of hospital staff perceptions on reporting and organisational learning for patient safety." Reliability Engineering & System Safety 144:45-52.
Tellis, G.J., Prabhu, J.C., and Chandy, R.K. 2007. Innovation in Firms Across Nations: New Metrics and Drivers of Radical Innovation. USC Marshall School of Business, Marshall Research Paper Series, Working Paper MKT 03-03, February 2007.