The Home Office has faced stark criticism from campaign groups for its decision not to reveal details of the algorithm used to filter UK visa applications.

In response to a legal challenge brought by immigrants’ rights campaigners over the use of its artificial intelligence programme, the Home Office provided a list of countries in different categories of “risk” — but the list had been entirely redacted.

The algorithm is understood to stream and sort applications for UK visas by applicant nationality, with some countries deemed “high risk”.

Concerns centre on applications from “high-risk” nationalities being easier to refuse, which in turn introduces bias into the algorithm’s streaming process.

The Home Office, however, has refused to reveal which nations have been deemed a risk within the algorithm.

The Home Office has acknowledged that applications are decided within the visa processing system on the basis of nationality but insisted its AI system is fully compliant with UK equality laws.

Campaign groups, however, continue to argue that the Home Office systems and the data on which the algorithm operates are error-prone, resulting in discrimination and flawed decision-making.

As Editor of Lawble, Gill helps businesses and individuals become better informed about their legal rights. Gill is a content specialist in the fields of law, tax and human resources.