WE ARE A MAGAZINE ABOUT LAW AND JUSTICE | AND THE DIFFERENCE BETWEEN THE TWO
April 28 2026

Met using Palantir AI to root out corrupt officers


A group representing frontline officers has accused the Metropolitan Police of ‘Big Brother’ tactics over its use of AI pioneered by the controversial tech company Palantir to ‘root out rogue cops’. According to the Guardian over the weekend, the force has deployed the software, which identified corruption as the most consistent offence.

As a result, some 98 officers are being assessed for misconduct related to ‘abuse of the IT system that rosters shifts by police officers for personal or financial gain’; another 500 have received prevention notices in relation to the same offence; and 42 senior officers are being ‘assessed for misconduct for serious noncompliance’ for falsely claiming to have been in the office when they had been working from home or away. The software also identified 12 officers who failed to declare that they were Freemasons, and another 30 officers have received prevention notices for suspected membership.

Palantir was founded by the US billionaire Peter Thiel in 2003 and has attracted controversy over its contracts with the Trump administration and the Israeli military, as well as concerns about its incursion into the NHS. The Metropolitan Police Federation, representing more than 30,000 frontline police officers in London, yesterday warned members to be ‘extremely cautious about carrying Metropolitan Police issued devices when off duty’. It warned that the ‘automated suspicion’ being placed on officers by the force will cause ‘significant damage to morale and trust’. The group is considering legal action against the Met over officers’ right to a private life under Article 8 of the Human Rights Act.

The London mayor Sadiq Khan could block the Met’s plans. The Telegraph reported that Khan issued a warning over public money being spent with firms that ‘act contrary to London’s values’. In February, the police force signed a £489,999 deal with Palantir. City Hall confirmed that the Mayor’s Office ‘must scrutinise and sign off on any police contracts worth more than £500,000’. A spokesman said: ‘We can’t comment on live procurement processes. However, as a general point the Mayor would have concerns about using public money to support firms who act contrary to London’s values.’

The human rights group Liberty last year warned of ‘dystopian predictive policing’ and ‘indiscriminate mass surveillance’ as a result of the police adopting Palantir’s software. The group claimed that numerous forces were refusing to confirm or deny links with the company, ‘citing risks to law enforcement and national security’, although Bedfordshire and Leicestershire confirmed they were working with the firm. David Nolan, a senior investigative researcher at Amnesty, said: ‘The establishment and provision of data-driven law enforcement and predictive policing technologies by companies such as Palantir […] raises severe human rights concerns, particularly given such companies have a history of blatant contempt for human rights. Technologies used for ‘crime prediction’ must be banned.’

Matt Cane, general secretary of the Metropolitan Police Federation, said that he was aware of the Met’s intention to upgrade its software but not that the upgrade included software from Palantir. Cane said its ‘continuous 24/7 geo-location tracking’ was ‘highly intrusive and risks monitoring officers when they are off duty, on rest days, or at home’. ‘This presumption of wrongdoing and attack on officers’ personal lives is unacceptable,’ he added.

‘Courageous colleagues across London do not deserve to be treated with this level of suspicion by their Big Brother Bosses,’ said Cane. ‘Police officers – like all people – have a right to a private life. Where is the transparency on this purge and the reassurance that the correct checks and balances are there on such a significant move? This use of AI will seriously damage the trust Metropolitan Police officers have in the force and ride a coach and horses through already plummeting morale.’ He described the use of AI ‘to spy on our officers’ as ‘not proportionate, just or proper. It’s an outrageous and unforgivable invasion of privacy.’