Treasury News Network

Who is responsible when analytics go wrong?

The finance function is set to rely increasingly on artificial intelligence and automation in the coming years – but whose fault is it when the machines get it wrong? According to research by KPMG, 92 per cent of senior executives are concerned about the impact on corporate reputation when data and analytics (D&A) are flawed or used inappropriately. With between 3 and 25 per cent of jobs predicted to be lost to automation and algorithm-based technology in the next seven years (figures from Gartner and PwC), and with finance among the sectors that could benefit most from automated processes, the question of who is responsible when analytics fail is a pressing one.

Executives' mistrust in analytics

The research – which asks “In a digital world, do you trust the data?” – shows that 65 per cent of executives have some reservations about, or active mistrust of, their data and analytics. Given such high levels of doubt, it is striking that the study by KPMG International also found that most senior executives (62 per cent) believe the IT department is responsible when a machine or an algorithm goes wrong – rather than responsibility lying with the C-level and functional areas. A quarter of the survey's respondents, however, thought the responsibility should lie with the core business, and 13 per cent said it should lie with regulatory and control functions.

The report suggests that, as more companies adopt automated processes built on algorithms, the issue of governing those machines and systems is becoming increasingly important. KPMG calls for “stronger accountability at the C-level rather than with the tech functions, and proactive governance with strategic and operational controls that ensure and maintain trust.”

C-suite needs to take responsibility

KPMG's Brad Fisher commented: “Our survey of senior executives is telling us that there is a tendency to absolve the core business for decisions made with machines. This is understandable given technology’s legacy as a support service and the so-called ‘experts’ in all technical matters. However, it’s our view that many IT professionals do not have the domain knowledge or the overall capacity required to ensure trust in D&A. We believe the responsibility lies with the C-suite.”

Building trust in analytics

The survey highlighted five recommendations for building trust within an organisation:

  1. develop standards to create effective policies and procedures for all organisations;
  2. improve and adapt regulations to build confidence in D&A;
  3. increase transparency of algorithms and methodologies;
  4. create professional codes for data scientists; and
  5. strengthen internal and external assurance mechanisms that validate and identify areas of weakness.
