The Citizen and the Automated State: Exploring the Implications of Algorithmic Decision-making in the New Zealand Public Sector
Algorithms increasingly influence how the state treats its citizens. This thesis examines how the New Zealand public sector’s use of algorithms in decision-making brings benefits, but also invites risks of discrimination, bias, intrusion into privacy and unfair decision-making. The thesis’s central conclusion is that these risks require a new response. New Zealand currently has a patchwork of protections which provide some deterrent against poor algorithmic decision-making. The Privacy Act 1993, Official Information Act 1982, New Zealand Bill of Rights Act 1990, Human Rights Act 1993 and applicable administrative law principles can provide remedies and correct agencies’ poor behaviour in certain cases. But important gaps remain. This thesis examines these protections to show that they do not adequately stem cumulative and systemic harms, and that they suffer from important practical drawbacks. They do not provide the sound preventative framework that is needed: one which ensures good public sector practice. This thesis therefore proposes a new regulatory model for public sector use of algorithms. It argues that a key element of any effective regulatory response is the use of “algorithmic impact assessments”. These assessments would mitigate the potential risks of algorithms and legitimise their proportionate use by the public sector. It is also proposed that an independent regulator complement these assessments by issuing guidance, undertaking algorithm audits, and ensuring political accountability through annual reporting to Parliament. Agencies would have new obligations to disclose how and when algorithms are used in decision-making. Citizens, meanwhile, would gain an enhanced right to reasons for algorithmic decisions affecting them and a right to human review. Together, these measures would establish a model that safeguards responsible and effective use of algorithms in New Zealand’s public sector.