To celebrate Right to Know 2020, Information Governance ANZ was delighted to host a timely discussion on the right to access information and the use of algorithms in government decision-making.
This interactive forum was facilitated by Susan Bennett, Founder of InfoGovANZ, and our special guests included:
- NSW Information Commissioner – Elizabeth Tydd
- Victorian Information Commissioner – Sven Bluemmel
- Senior Research Fellow, University of Cambridge – Dr Jat Singh
The increasing adoption of technology across society, including in government, requires the preservation, assurance and assertion of information access rights. The right of access to government information also extends to information held by contractors that provide services to the public on behalf of government.
Applying and challenging these rights becomes more complex in the new world of automated decision-making. Legal frameworks, licensing and contracts must evolve to ensure information governance is applied to the design and use of algorithms, so that rights and interests are protected.
Commissioner Sven Bluemmel on building trust through transparency
Commissioner Bluemmel explained that Right to Know Day is actually a global event, recognised as International Access to Information Day (IAID). It aims to raise awareness of the importance of open government and the public’s right to access government-held information.
We consider this right the foundation of transparent and accountable government in a democratic society: it supports public participation and scrutiny of government decisions and, by extension, better decision-making.
As the public sector is increasingly making use of technology such as artificial intelligence (AI) to provide services and make decisions, there are questions about how the principles of good governance can best be applied. OVIC published an e-book called “Closer to the Machine”, exploring the technical, social and legal aspects of AI, including algorithmic transparency.
The Commissioner noted the importance of trust to the relationship between government and citizens. The relationship people have with government is very different from their relationship with a business or other kind of service provider where an algorithm could be considered the “secret sauce”.
Government decisions impact people’s livelihoods, their freedoms and human rights. It’s one thing if an algorithm incorrectly predicts my favourite songs or shopping preferences. It’s quite another if I’m denied a pension or parole based on an automated decision.
So government has to take a very different approach to the application of AI and similar technologies. It has to ensure transparency and accountability are fully supported so that decisions can be explained, challenged and, most importantly, trusted.
This aspect of trust and trusted information is especially important during times of crisis. “Forcing” people to do the right thing is far less effective than having a trusted relationship with constituents. There has to be trust in the government’s motives as well as its mechanisms. In this situation, reliance on defensible data and explainable modelling comes more into play.
Commissioner Elizabeth Tydd on the shift to digital government
Commissioner Tydd mentioned some findings from the IPC Community Attitudes Survey. There is growing support (72%) for government use of de-identified data to plan and deliver public services. There is also strong demand for accountability and transparency in the collection and use of data with 78-81% support for more public reporting. Overall, 88% of respondents said their right to access government information was important.
People are becoming interested in a wider range of information, not only access to their personal information or information about government services. They also seek access to information about how policy is developed and how agencies operate. This has implications as government makes use of data and technology in new and different ways. Administrative practices must evolve to safeguard the right of access to information.
The Commissioner highlighted several case examples challenging algorithmic transparency in government decision-making and asked: Is the concept of access to information sufficiently “malleable” and “future proof”?
In one case, a social housing tenant challenged the calculation of their rental subsidy, which had been determined using an algorithm. It was difficult for the agency to respond to the citizen’s request for information about the decision-making process. Procurement of the system was below the threshold for public reporting on government contracts, and the developer claimed that details of the algorithm were commercial-in-confidence.
In NSW, the GIPA Act applies beyond government agencies to include certain information held by other organisations contracted to provide a public service. But the way this applies to automated decision-making is still being tested.
The Commissioner noted that access to information is an enabling right – it enables a citizen to assert other rights. If access to information is restricted it can have much wider implications. She stressed that “accessibility” means information must be understandable on every level. It’s not sufficient to simply release the source code or require the expertise of a data scientist. Access must also come at minimal cost to the applicant.
In another case study, the Gradient Institute considered an AI-driven system for financial approvals, based on credit history and past transactions, which unfairly discriminated against female applicants. Historical data can introduce bias and AI predictions may become less accurate as society changes over time – leading to more perverse outcomes.
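To make this concrete, the short Python sketch below (entirely invented data and group labels) shows the kind of check the case study points to: comparing approval rates across groups in a decision log to surface possible disparate impact.

```python
from collections import defaultdict

# Invented decision-log entries: (applicant_group, approved?)
decisions = [
    ("female", False), ("female", True), ("female", False), ("female", False),
    ("male", True), ("male", True), ("male", False), ("male", True),
]

counts = defaultdict(lambda: [0, 0])          # group -> [approved, total]
for group, approved in decisions:
    counts[group][0] += int(approved)
    counts[group][1] += 1

rates = {g: approved / total for g, (approved, total) in counts.items()}
print(rates)                                  # {'female': 0.25, 'male': 0.75}

# A wide gap is a prompt to review the training data and the model,
# not proof of unfair treatment on its own.
ratio = min(rates.values()) / max(rates.values())
print(f"Approval-rate ratio: {ratio:.2f}")    # 0.33 here, below the common "80% rule" of thumb
```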
An applicant has the right to understand how a decision was made, but there are also sound business reasons for explainability and algorithmic transparency. The design phase is key. Organisations must ensure their processes are auditable, to understand how learning and recalibration happen.
POLL: Who is using machine-enhanced or automated decisions? A high proportion of participants answered “don’t know”. Does this suggest that information governance practitioners need to get more involved with AI and machine learning projects?
Dr Jat Singh on how automated decision-making works
Dr Singh noted that automated decision-making sits at the intersection of law, policy, community and technology. He mentioned the difference between automated decision-making and enhanced decision-making, where machine learning uses data to determine probability or make predictions that support human decisions. It is a policy decision, not a technical decision, to determine what kinds of data use are appropriate.
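As a simple illustration of that distinction, the sketch below shows an “enhanced” decision: a model estimates a probability and a human officer makes the final call. The data, features, threshold and library choice (scikit-learn) are assumptions for illustration only, not a description of any system discussed in the session.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical cases: [income_in_$10k, prior_defaults] -> repaid (1) or not (0)
X = np.array([[3, 2], [8, 0], [5, 1], [9, 0], [2, 3], [7, 1]])
y = np.array([0, 1, 1, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# A new application: the model estimates a probability, nothing more.
applicant = np.array([[4, 1]])
p_repay = model.predict_proba(applicant)[0, 1]

print(f"Estimated probability of repayment: {p_repay:.2f}")
# The final decision stays with a human officer; the threshold below is a
# made-up triage rule, not an automated approval.
print("Refer to an officer for review" if p_repay < 0.8
      else "Recommend approval (officer to confirm)")
```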
A recent example of enhanced decision-making involved COMPAS software which uses algorithms to assess potential risks of recidivism. Judges may use these predictions to inform their decisions when detaining or sentencing a defendant. The algorithms remain a trade secret and this lack of transparency could cause a breach of due process. The case study also raised concerns about machine bias resulting from training data or input data, rather than the algorithm itself.
Dr Singh emphasised that accountability relies on explainability, but it can be challenging to explain how an algorithm works and make it transparent. Each stage of system design involves embedding norms, potential bias and assumptions – not just the training data. There is also a risk of “model skew” over time: developers must consider whether a model is still fit for purpose as the data and society change. Whether or not a model is retrained for business or other purposes, auditability is essential.
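One common way to watch for this kind of skew is a population stability index (PSI), which compares the distribution of a score or input at training time with what the model is seeing in production. The sketch below uses synthetic data and is only a rough illustration, not a prescribed method.

```python
import numpy as np

def population_stability_index(training, recent, bins=10):
    """Rough 'model skew' check: compare the distribution of a score or input
    feature at training time with its distribution in recent production data."""
    edges = np.quantile(training, np.linspace(0, 1, bins + 1))
    recent = np.clip(recent, edges[0], edges[-1])   # keep new values inside the training range
    expected = np.histogram(training, edges)[0] / len(training)
    observed = np.histogram(recent, edges)[0] / len(recent)
    expected = np.clip(expected, 1e-6, None)        # avoid log(0) for empty bins
    observed = np.clip(observed, 1e-6, None)
    return float(np.sum((observed - expected) * np.log(observed / expected)))

# Illustrative use with synthetic scores: the population the model sees has shifted.
rng = np.random.default_rng(0)
training_scores = rng.normal(0.50, 0.10, 10_000)
recent_scores = rng.normal(0.60, 0.12, 10_000)
print(f"PSI: {population_stability_index(training_scores, recent_scores):.2f}")
# Values above roughly 0.2-0.25 are commonly treated as a signal to review or retrain.
```

In practice a check like this would be run regularly against live data, with the results kept as part of the audit trail.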
The key is to look more broadly at the system as a whole and identify information that can be made available. For example: information about the underlying data used by the algorithm; its source code and performance or test criteria; the software on which it is built; the process steps or workflow for a decision; and how it is deployed – whether a decision automatically triggers an action, or simply provides advice to a decision-maker. This means record keeping is required at all stages, to answer questions that will arise.
Releasing such information enables people to oversee, interrogate and challenge the validity of the system, as well as the validity of a particular decision. It also helps to improve the accuracy and fairness of algorithms. To be auditable, systems must be able to expose the relevant information in real time and this capability should be clearly defined from the outset.
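As a rough illustration of what that record keeping might capture, the sketch below defines a hypothetical audit record for a single machine-assisted decision. All field names and values are invented; an agency would align them with its own record-keeping and reporting obligations.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
from typing import Optional
import json

@dataclass
class DecisionRecord:
    """One row of an audit trail for a machine-assisted decision (illustrative only)."""
    decision_id: str
    model_name: str
    model_version: str                 # which trained model produced the output
    training_data_reference: str       # pointer to the dataset snapshot used to train it
    inputs: dict                       # the values the decision was based on
    output: dict                       # the score, prediction or recommendation
    mode: str                          # "automatic" (triggers an action) or "advisory"
    human_reviewer: Optional[str]      # decision-maker who acted on the advice, if any
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

record = DecisionRecord(
    decision_id="2020-000123",
    model_name="rental-subsidy-calculator",        # invented system name
    model_version="1.4.2",
    training_data_reference="datasets/subsidy-history-2019-12",
    inputs={"household_income": 41000, "household_size": 3},
    output={"weekly_subsidy": 112.50},
    mode="advisory",
    human_reviewer="housing.officer@example.gov",
)
print(json.dumps(asdict(record), indent=2))        # stored alongside the case file
```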
POLL: Who has received an application for access to an algorithm or information about an automated decision? Only 13% of webinar participants.
Many citizens don’t yet know they can ask for this kind of information, but this is changing. The NSW IPC has published a fact sheet for citizens and another for agencies to raise awareness of automated decision-making and information access rights.
Susan Bennett on the multidisciplinary approach to governance
Susan emphasised the importance of establishing the purpose of new technology, how it aligns with the goals of the organisation, and who the intended beneficiary is.
She also observed growing interest in privacy, and noted that effective assessment of ethical data use and privacy impacts requires a multidisciplinary approach. Boards and senior executives must consider whether their organisations have the right governance structures in place to deal with these issues.
Procurement processes have a critical role in achieving a “privacy by design” approach, ensuring that licences and contracts include provisions to meet information access obligations. She pointed out that a “right to audit” should be included in contracts, not just in relation to data breaches but to confirm a solution is working as intended.
Organisations must also continue to invest in the fitness of their systems as data, technology and society change.
If you would like to listen to the full discussion you can access a recording of the session here.
Author
Sonya Sherman is a member of the Information Governance ANZ Advisory Board and Principal at Zen Information.