
Artificial Intelligence (AI) and Probation Services by Mike Nellis

It is no longer in science fiction alone that AI (“artificial intelligence”) figures in futuristic forms of crime control, but its real-world implications – not least for probation services – are far from clear. This is surprising, because while AI is not exactly new, the specific technologies grouped under this loose and ambiguous rubric – driven by commercial investment and political expediency – continually increase in reach and power, surpassing what most laypeople can imagine, while state and corporate actors relentlessly envision and enact new uses for it. AI’s impact on probation services may be minimal now, and its trajectory is genuinely hard to discern, but some probation services have already appointed AI specialists to their senior management teams, and there is a debate to be had about what comes next.

Everybody knows, surely, that the emergence of AI has major implications for employment in all the so-called human service professions – reconfiguration of some kind, and increased efficiency (yes, again!), are part of what the energising but complacent narrative of the “fourth industrial revolution” promises, obliterating the losses that may be incurred en route. Probation services will not be immune from this. In Europe they have arguably been slow to acknowledge the occupational implications of AI and, if they are paying attention at all, too accommodating of talk about its inevitability, necessity and managerial “potential”. Probation’s relative silence on its likely future shape under “surveillance capitalism” (a more critical counter-narrative to the “fourth industrial revolution”, but by no means the final word) must end if wise decisions are to be made, and viable tactics and strategic alliances developed, in respect of AI.

Electronic monitoring (EM) may be the template through which probation services are appraising the potential impact of AI, since in Europe EM has by and large been effectively constrained (so far) by the ethos and culture of probation. But that would be a mistake: AI has far more comprehensive implications for probation services than EM, not least for the size, structure and skillset of the workforce, as well as for the transformation of supervisory practices themselves. That said, the steady adoption of smartphones and laptops to aid supervision, sometimes framed as a “further development” of EM and accelerated by the Covid pandemic’s constraints on face-to-face working, is a possible stepping stone towards the use of AI. In the US, the National Institute of Justice had commissioned action research on AI and smartphone-based EM before the pandemic, indicating, if nothing else, what thought leaders in corrections have in mind. Compared to humanistic forms of probation, even compared to electronic location monitoring, voice, text and camera-based smartphone supervision can generate vastly greater amounts of data about offenders’ lifestyles, dispositions and behaviour, without ever fully recognising them as “persons”.

Data is what AI systems train on, process and infer from, with a view to predicting what’s next (imminent or eventual) at varying levels of granularity – individual, institutional, corporate, electoral, demographic, societal – even genetic, molecular and viral if AI is being used to predict disease and pandemic trajectories. Predictivity – of consumer choices and market development – is the business model of “surveillance capitalism”, but it has already been infused into some aspects of policing – the prediction of individual or hot-spot offending – with mixed and ambivalent success, and is finding a place in other public services, predicting various aspects of client behaviour. Probation is no stranger to predictive technologies – risk management is mathematically premised on prediction – but the “online decision support systems” already becoming available to probation services (rudimentary forms of AI, with the potential to become much more if probation supplies them with more data) draw on much larger data streams, and normalise prediction (of compliance, need, risk and reoffending) as a feature of everyday practice in supervision. The specific ethics of predictive policing are already fraught – what should be done to pre-empt imminently predicted reoffending: immediate arrest or compassionate support? – but in a world where AI is becoming ubiquitous the ethics of predictivity itself are becoming harder to question.
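To make concrete what such predictive decision support might involve, the sketch below fits a toy reoffending-risk model on entirely synthetic supervision data. It is a hypothetical illustration only, not a description of any real probation instrument or of the systems referred to above: the feature names, the assumed statistical relationships and the choice of a scikit-learn logistic regression are all assumptions made for the purpose of the example.

```python
# Hypothetical sketch: a toy "decision support" risk score of the kind
# discussed above, trained on entirely synthetic supervision data.
# Feature names and relationships are illustrative assumptions, not any
# real probation instrument.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000

# Synthetic features a smartphone-assisted supervision stream might yield:
# missed check-ins, curfew-zone alerts, prior convictions, months employed.
X = np.column_stack([
    rng.poisson(1.5, n),      # missed_checkins
    rng.poisson(0.8, n),      # zone_alerts
    rng.integers(0, 10, n),   # prior_convictions
    rng.integers(0, 24, n),   # months_employed
])

# Synthetic outcome: an assumed, toy relationship in which reconviction risk
# rises with the first three features and falls with employment.
logit = -1.5 + 0.4 * X[:, 0] + 0.5 * X[:, 1] + 0.2 * X[:, 2] - 0.05 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# A probability per case, of the kind a dashboard might surface as a "risk score".
scores = model.predict_proba(X_test)[:, 1]
print("AUC on held-out synthetic cases:", round(roc_auc_score(y_test, scores), 3))
print("Coefficient per feature:", np.round(model.coef_[0], 2))
```

Even this toy version makes the point at issue: whatever accuracy such a score achieves, the number itself says nothing about what should be done with the prediction, which remains a question of policy and ethics rather than of modelling.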

Since 2014, major European institutions have become preoccupied with AI, believing it essential to economic prosperity and political security, and aspiring to keep abreast of developments in China (where there are no state constraints on mass data collection or surveillance) and the USA (where the agenda is dominated by the interests and expertise of major tech companies). Europe, however, wants all uses of AI within its boundaries to be compliant with human rights, democratic processes and the rule of law. Reconciling human rights, and the forms of human life in which they have hitherto been grounded, with the intrinsically inhuman, machine-based character of AI is a taller order than it first appears. A range of European Commission and Council of Europe committees, sub-committees and working parties have nonetheless begun serious work on this. The implications of AI for policing and judicial decision-making were explored early, and in 2021 the Council of Europe’s PC-CP (Council for Penological Co-operation) began developing an ethical recommendation on the use of AI in both prisons and probation services, and in the private companies which develop and deliver services on their behalf.

European institutions are realistically taking for granted that AI will grow in significance and affect everything. The danger here is uncritically accepting that those who devise and champion AI already have the power to make it happen, infuse it everywhere and overcome all resistance. But AI is not a natural, inexorable, evolutionary development – the forms, scale and pace of its deployment must always be contested, which requires greater “AI literacy” among those who might need to resist its encroachment. There is indeed technological ingenuity at AI’s root, but its capabilities and uses are politically and commercially driven, serving some interests and not others. AI’s rewards will, for certain, be unevenly distributed – unlike earlier waves of automation, the jobs lost this time may have no replacement. Crucially, AI – because of the resources needed to build and administer it – will increase the power of the already powerful. There are many reasons to think that it will deepen social inequality and few reasons to think that it will extend or strengthen democracy. There are many ingrained social injustices – as probation services are all too aware – to which AI is manifestly not a solution, and about which it may become harder to talk and act. It is not a question of whether AI could be put to good uses – it could – but a question of whether it will, and who decides.

* Mike Nellis is Emeritus Professor of Criminal and Community Justice, University of Strathclyde, honorary member of CEP and one of two advisers to the current PC-CP work on AI in prisons and probation.
