Government aims to make its ‘black box’ algorithms more transparent

The government has launched an algorithmic transparency standard amid concerns that biased algorithms are affecting the way Britons are treated by the state.

Algorithms are computer systems that ingest a wide range of data to produce a single answer – but because people can rarely see the logic used to generate these answers, flaws in that logic can go unnoticed.
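
To make that concrete, here is a minimal, hypothetical sketch in Python of the kind of "black box" scoring the concerns describe. The fields, weights and threshold are all invented for illustration and do not come from any real system:

```python
# A minimal, hypothetical sketch of "black box" scoring: many inputs go in,
# a single answer comes out, and the internal weighting stays invisible to
# the person being scored. All weights and fields here are invented.

def risk_score(applicant: dict) -> float:
    """Combine several inputs into one number (illustrative only)."""
    weights = {
        "years_at_address": -0.2,  # longer residence lowers the score
        "missed_payments": 1.5,    # missed payments raise it sharply
        "postcode_band": 0.8,      # a proxy field like this can smuggle in bias
    }
    return sum(weights[key] * applicant[key] for key in weights)

def decide(applicant: dict, threshold: float = 2.0) -> str:
    # The subject sees only this single answer, never the weights above.
    return "flag for review" if risk_score(applicant) >= threshold else "approve"

print(decide({"years_at_address": 3, "missed_payments": 1, "postcode_band": 2}))
# -> flag for review
```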

These flaws can have a major impact on people's lives. Around the world, biases have been detected in algorithms used not just to decide what content to advertise to web users, but also to set insurance rates and even to inform prison sentences.


Last year thousands of angry students saw their A-level results downgraded by an allegedly biased algorithm introduced after COVID-19 led to exams being cancelled.

Privacy campaigners have complained that “huge databases” are being used to predict whether children are at risk of involvement in crime and to profile benefits applicants.

Earlier this year a report by Big Brother Watch on “the hidden algorithms shaping Britain’s welfare state” claimed in its conclusion: “Local authorities’ issuance of housing benefit is a laboratory for the government to test the usefulness of predictive models.

“Our thorough investigation has led us to conclude that many predictive systems are no more than glorified poverty profiling, working to datify the long-held prejudices authorities in society hold against the poorest,” the report said.

“A common thread across all these automated systems is a lack of due attention paid by councils to the serious risks of bias and indirect discrimination,” it added.

The new algorithmic transparency standard – developed by the Cabinet Office’s Central Digital and Data Office with input from the Centre for Data Ethics and Innovation (CDEI) – follows a review into bias in algorithmic decision-making.

In its review, the CDEI recommended “that the UK government should place a mandatory transparency obligation on public sector organisations using algorithms to support significant decisions affecting individuals”.

The transparency standard will require public sector organisations to declare which algorithmic tools they are using and for what purpose, and to fill out a table explaining how these algorithms work.
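
The article does not reproduce the table's actual fields, but as a rough, hypothetical illustration, a completed record might carry information of this general shape. Every field name and value below is invented, not the standard's published schema:

```python
# A hypothetical sketch of the kind of information such a record might hold.
# Field names and values are invented; they are not the standard's schema.

transparency_record = {
    "tool_name": "Housing Benefit Risk Model",  # invented example
    "organisation": "Example Borough Council",  # invented example
    "purpose": "Prioritise claims for manual review",
    "decision_role": "Advisory only: a caseworker makes the final decision",
    "data_sources": ["claim history", "payment records"],
    "human_oversight": True,
}

for field, value in transparency_record.items():
    print(f"{field}: {value}")
```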

At its most open, the standard will see public sector organisations also share the data used to train their machine learning models, along with a description of the categories used for that training – both crucial steps in detecting bias.
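
Why does sharing the training data and its categories matter? With both in hand, even a simple check can compare a model's outcomes across groups. The sketch below uses invented records and is purely illustrative; it is not part of the standard itself:

```python
# An illustrative check, with invented data: compare a model's approval
# rates across demographic groups in the training records. A large gap
# does not prove bias, but it flags where to look more closely.

from collections import defaultdict

# Hypothetical training records: (demographic_group, model_decision)
records = [
    ("group_a", "approve"), ("group_a", "approve"), ("group_a", "deny"),
    ("group_b", "deny"), ("group_b", "deny"), ("group_b", "approve"),
]

totals = defaultdict(int)
approvals = defaultdict(int)
for group, decision in records:
    totals[group] += 1
    approvals[group] += decision == "approve"  # True adds 1, False adds 0

for group in sorted(totals):
    print(f"{group}: approval rate {approvals[group] / totals[group]:.0%}")
# group_a: approval rate 67%
# group_b: approval rate 33%
```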

Image: The DWP has been criticised for using algorithms to create 'glorified poverty profiling'

The new standard will not immediately be used by all government departments and public sector bodies, instead being “piloted” by several of them – who have not yet been named – in the coming months.

It is not yet clear whether these transparency pilots will shed light on the most controversial uses of algorithms.

Lord Agnew, minister of state at the Cabinet Office, said: “Algorithms can be harnessed by public sector organisations to help them make fairer decisions, improve the efficiency of public services and lower the cost associated with delivery.”

“However, they must be used in decision-making processes in a way that manages risks, upholds the highest standards of transparency and accountability, and builds clear evidence of impact,” he added.

The standard was welcomed by Imogen Parker at the Ada Lovelace Institute, who said: "Meaningful transparency in the use of algorithmic tools in the public sector is an essential part of a trustworthy digital public sector."

“We look forward to seeing trials, tests and iterations, followed by government departments and public sector bodies publishing completed standards to support modelling and development of good practice,” Ms Parker said.
