
The Information Commissioner’s Office (ICO) has given companies approval to use automated hiring processes, with a caveat: they must review those processes to ensure the right protections are in place. The announcement comes as the ICO released a new report entitled Recruitment rewired: an update on the ICO’s work on the fair and responsible use of automation in recruitment. The ICO has also updated its draft guidance on the use of automated decision-making (ADM), including profiling, to reflect changes to the UK GDPR following the introduction of the Data (Use and Access) Act 2025 (DUAA).

William Malcolm, Executive Director for Regulatory Risk and Innovation at the ICO, said: “Use of AI and automation is rapidly transforming recruitment across the UK – from helping sift CVs to scoring online assessments.
“We want to support organisations to take advantage of both recent changes to the law and these new tools. But responsible innovation and adoption of this new technology require safeguards to be in place to protect jobseekers – which are foundational to public trust.”
Different approaches reveal how AI is used
The report is the result of a nine-month study that started in March 2025 and involved 30 organisations. It has allowed the ICO to gather evidence on how employers are using automation in the recruitment process.
Organisations are using automation in different ways. Some are still in early phases of pilots and limited use cases. Others are looking at more sophisticated tools.
Some of the employers initially spoken to have since stopped using automated recruitment, saying it is no longer needed. The report offers no data on why they made that choice.
Where automation is used, which tools, and for what roles, also differ. No organisation is using automation universally across its recruitment processes. Some use automated recruitment to filter applications, especially when a vacancy has drawn a large response.
This includes scoring and ranking candidates based on data in the CV. Employers are also using AI-powered behavioural games and psychometric assessments. These are reasonable evolutions of existing processes.
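The scoring-and-ranking filtering described above can be pictured with a deliberately simplistic sketch. This is purely illustrative: the keyword criteria, weights, and candidate data are hypothetical, and commercial tools are far more sophisticated (and far more opaque) than a keyword match.

```python
# Illustrative sketch only: a naive keyword-based CV scorer and ranker.
# Criteria, weights, and CVs below are hypothetical examples.

def score_cv(cv_text: str, criteria: dict[str, int]) -> int:
    """Score a CV by summing the weight of each criterion it mentions."""
    text = cv_text.lower()
    return sum(weight for term, weight in criteria.items() if term in text)

def rank_candidates(cvs: dict[str, str], criteria: dict[str, int]) -> list[tuple[str, int]]:
    """Rank candidates by descending score."""
    scored = [(name, score_cv(text, criteria)) for name, text in cvs.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

criteria = {"python": 3, "sql": 2, "team leadership": 1}  # hypothetical weights
cvs = {
    "A": "Five years of Python and SQL development.",
    "B": "Team leadership experience; some SQL reporting.",
}
print(rank_candidates(cvs, criteria))  # -> [('A', 5), ('B', 3)]
```

Even this toy version shows why the ICO’s caveat matters: the outcome depends entirely on who chose the criteria and weights, and a candidate has no visibility into either.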
AI is also moving deeper into the process. It is being used to score written responses to questions and transcripts of interviews.
There is also the use of AI for sentiment and emotional analysis, assessing the language, tone and content of a candidate’s speech to predict personality types. These techniques are already used by banks and some contact centres. The question is whether they are suitable for a high-pressure environment such as a job interview.
Whether a decision is solely ADM also needs to be carefully defined. If there is meaningful human involvement in the end result, the ICO says it is not solely ADM. Organisations need processes that show who makes that final determination, and review processes for when a decision is challenged.
Concerns from candidates
The report does not draw on candidates who have been through the process. However, the last year has seen multiple press articles and social media posts addressing the issue. For most candidates, the impersonal nature of the rejection is a problem, as is the lack of any explanation of why. Will this lead to an increase in Subject Access Requests? The report does not say.
There is a section on transparency and safeguards that needs careful study. There are provisions under the UK GDPR, specifically Articles 13, 14 and 15, that must be addressed. Additionally, the ADM safeguards in Article 22C address how organisations must inform people of decisions taken and why.
How far will employers go in providing that information? “The computer says no” is not going to work. Neither will the approach of financial institutions that withhold data on the grounds that it would reveal how they screen for fraud. How will employers demonstrate the trustworthiness of their recruitment processes to individuals?
Bias is an area that concerns everyone, and that is addressed, in part, in this report. The use of sentiment analysis will be of concern when it comes to racial bias. There is a long list of academic research on the subject, such as this one from researchers at the University of South Carolina.
It is not just bias. People react to interview processes in different ways. Some get excessively nervous, and that can lead to challenges in how they are perceived. An experienced recruitment team can sift through that. But there is little evidence from commercial recruitment solutions that they have a way to allow for those candidates.
With ADM, that could easily lead to candidates being rejected for no other reason than struggling in an interview, as perceived by the AI.
Enterprise Times: What does this mean
Like other areas where ADM is occurring, organisations see AI as a way to reduce costs. But this is a one-sided approach. It will be interesting to see how much work is done by the ICO to regulate this market. This report is a good starting point, but there is much to be done beyond it.
Nobody wants more compliance controls, but for ADM, there must be significant checks and balances. When that ADM affects the lives of people, there must be transparency and accountability. That is where legislation comes in.
The ICO believes it has found a way to help organisations navigate this, and a nine-month evidence-gathering phase is a good start. However, it needs much greater data input from a far wider set of organisations to deliver meaningful guidelines.
The ICO’s Enabling Business in the UK Economy report shows that it believes it has generated £233 million in economic value for UK businesses over five years. For many, the unanswered question is what it has generated for individuals. It is, after all, supposed to act as a safeguard for both sides.
The post ICO Approves the Use of Automated Hiring appeared first on Enterprise Times.
