Wave upon wave of digital technologies and the fast-advancing frontier of artificial intelligence (AI) are disrupting well-established, institutionalized ways of doing things, raising a host of ethical, regulatory, and policy challenges. The core tension lies in addressing the social, ecological, moral, and economic risks associated with technological advancements while, at the same time, encouraging competitiveness, innovation, and value generation from ubiquitous digital data. To address current ethical challenges and risks and to anticipate future issues related to emerging technologies, governments are seeking to adjust regulations and set policy, and organizations are seeking to adapt their governance schemes to regulatory and ethical issues. This special issue seeks sociotechnical and information systems contributions to our thinking on ethics, regulation, and policy around digital technologies, data, and AI.
Ethical tensions abound in relation to the pervasiveness of digital data and the widespread adoption of AI, including tensions between data privacy, fairness, and bias on the one hand, and business needs for using data for market and innovation purposes on the other. AI systems that increasingly replace cognitive tasks previously done by humans raise questions about deskilling and labor substitution and, ultimately, human dignity. While AI systems promise productivity gains and the potential to address key social and environmental issues, they also pose significant threats to sustainability and create social challenges. Moreover, responsibility for the tasks algorithms perform and the decisions they make is often unclear. The autonomy, interconnectedness, and increasing agency of digital technologies challenge established individualistic approaches to responsibility because they make responsibility difficult to attribute.
The tension between addressing ethical risks and encouraging innovation requires balancing data privacy, fairness, and bias concerns with business needs for competitiveness and value generation from digital data and AI systems. These ethical challenges have inspired regulatory and policy action around digital technologies, particularly with respect to data and AI. The United States National Artificial Intelligence Initiative Act of 2020 sets forth programs and activities and highlights values such as trustworthiness; at the same time, it aims to create competitive advantage for the US economy by building an innovation ecosystem and fostering public-private partnership mechanisms. The European Union's AI Act embeds fundamental values as it seeks to accomplish "a high level of protection of health, safety, fundamental rights" and defines requirements for transparency, monitoring, market surveillance, and governance that provide legitimacy for the use of AI systems; in addition, it seeks to promote innovation by means of regulatory sandboxes. The tension between data privacy and data use, for instance, is reflected in the General Data Protection Regulation, which builds on privacy as a fundamental value.
Regulatory responses involve institutional construction and ongoing institutional change, often requiring the deliberate coordination and action of change agents at different levels, including political, corporate, or non-governmental individuals and groups. These actors need to make sense of emerging technologies and their prospects in order to create policy and to regulate them. Rapid changes in the sociotechnical context, fueled by the decentralized and immaterial nature of digital technologies, require frequent updates and often render conventional approaches to setting regulation and policy slow and ineffective. This development has given rise to a move from established frameworks that sanction undesired behaviors through stable, well-established structures to regulatory and policy-focused frameworks that allow for more frequent updates, resulting in novel approaches to ensuring ethical technological advancement.
Regulatory approaches extend beyond formal law-making: they include market regulation through contracts, as well as self-regulation, the "practice of industry taking the initiative to formulate and enforce rules and codes of conduct." Such soft-law approaches consist of "non-binding norms of action, process, and behaviour, for whom sanctions of the formal regulatory type play no part." Beyond merely complying with regulations, many organizations are seeking to internalize ethical approaches to the governance of powerful digital technologies. Rules can even be implemented in software, allowing for new forms of algorithmic regulation and highlighting how technology can be a carrier of institutional elements.
Clearly, digital technology is not only an external agent that raises ethical concerns and triggers regulatory change. As a tool and agent, digital technologies can serve to implement, monitor, and update policies and regulations. Such tools help regulatory institutions achieve legitimacy and stability—for instance, when "regtech" monitors the implementation of a regulation, when smart contracts prevent post-hoc changes, or when a system's architecture prevents certain uses or nudges users towards preferred uses. Technology is not only the source of unanticipated challenges, but can also be part of the solution for regulatory problems, which marks the distinction between "regulation of technology" and "regulation through technology."
Ethical challenges as well as policy and regulatory change require organizational efforts to adopt and comply with new regulations and policies. This is challenging because it requires organizational actors to make sense of and interpret often-complex changes in the regulatory institutional environment, and it demands changes in organizational practices as well as in technology architecture and implementation. For instance, the General Data Protection Regulation and subsequent European data regulations have had profound organizational impacts related to the implementation of technical solutions, adjustments of organizational structure, and strategic considerations about minimal compliance versus full adoption. Further, organizations are not necessarily passive recipients of regulatory change and new policies: they respond to regulatory pressures in different ways and actively take positions to influence and shape regulation and policy.
The scholarly information systems community provides rich theoretical perspectives to study the institutional challenges around ethics, regulation, and policy. Specifically, the information systems field provides a sociotechnical foundation for understanding the ethical implications of digital technologies. Research points to a host of ethical issues relating to contemporary digital technologies such as AI systems, and information systems research is developing approaches for tackling these thorny issues. Organizations are increasingly being held accountable for ethical issues, and the field provides a foundation in organizational governance and in the design of technology that can inform research into these issues. Further, the field has developed a rich body of literature on how digital technologies are implicated in institutional change, including how institutional elements, material elements, and enacted practices shape one another. Researchers have paid increasing attention to how these practices relate to the broader institutional context at the field level, and recent work has focused on regulatory institutions in particular.
Technology regulation and policy seek to allow for innovation while mitigating the risks associated with those technologies. This special issue seeks sociotechnical research that draws on information systems insights to understand how ethical, regulatory, and policy issues evolve and are addressed; their impact on innovation and sustainability; effective regulatory design and governance; compliant technology design; and organizational adaptation under economic, ethical, environmental, and social considerations.