Efforts to regulate software and to hold organizations and individuals accountable for its consequences have increased in recent years. Traditionally, the human-in-the-loop (operator, user, or bystander) is blamed for undesirable behavior of software systems in the real world. This stems from a limitation of the user-centered design approach, which adopts the mental model (MM) of an average user. The core belief in this paper is that user-centered design must take a wider lens on stakeholder interactions within socio-technical ecosystems: including users from diverse backgrounds and consulting certifiers, manufacturers, and regulatory agencies for a given jurisdiction. We envision a socio-technical co-design approach for the development of compliant autonomous socio-technical systems (ASTS), which can infuse novel interpretations of regulations based on the social, behavioral, and economic (SBE) background of users. We posit that accountable software has three properties: 1) operational transparency: the software is amenable to monitoring of parameters relevant to tacit knowledge (the user's MM of the ASTS); 2) operational adaptability: the software can be configured to support evaluation of regulatory compliance under changing performance expectations and compliance perceptions; and 3) operational interpretability: the software can assist in generating feedback that guides assessment of the compliance properties of novel modes of operation, a consequence of dissonance between the user's MM and the system designer's view of the users' MM.
|Original language||English (US)|
|Journal||CEUR Workshop Proceedings|
|State||Published - 2021|
|Event||2021 Workshop on Artificial Intelligence Safety, AISafety 2021 - Virtual, Online|
Duration: Aug 19 2021 → Aug 20 2021
ASJC Scopus subject areas
- Computer Science (all)