
HMRC, DWP and PO struggle with complexity and fact

by Bill Mew

Recent mistakes by HMRC, DWP and the Post Office have had significant consequences. Bill Mew investigates the reliability of systems of record, and how things could get worse.

18th May 2021

A series of recent mistakes by major government departments has had significant consequences. HMRC has been pursuing taxpayers over penalty notices that they never received, DWP is struggling with duplicate records, and dozens of sub-postmasters have even been jailed on the basis of faulty evidence.

Accountancy is based on the assumption that there is a single authoritative reference point from which a single version of the truth can easily be derived. On top of this, there should be a log of entries and changes to provide a level of accountability – a system of record.

Definition

While the term ‘system of record’ has been confused or abused to mean any legacy system, it actually refers to a system where original data is entered or recorded, and that provides an authoritative reference point for other systems.

Historically, written ledgers were systems of record. Once entries were made, they then became a matter of record. Restrictions were placed on who could make entries into such ledgers, and those doing so would be held to account for mistaken data entries or calculations.

Systems of record should also not be confused with systems of engagement. These are the systems that interpret and represent the data as insights and analysis in dashboards, sales support systems, or management information systems. 

On this basis, while ledgers and accounting records would be systems of record, the tax systems that use such data to derive tax records based on an interpretation of tax policy would be systems of engagement.
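To make the distinction concrete, here is a minimal, purely illustrative sketch in Python (with invented names, not any department's actual design): the system of record is an append-only store of original, attributed entries, while the system of engagement merely derives a report from it.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class LedgerEntry:
    """An original, immutable entry in the system of record."""
    account: str
    amount: float
    entered_by: str
    entered_at: datetime


class SystemOfRecord:
    """Append-only ledger: entries can be added, never edited or deleted."""

    def __init__(self) -> None:
        self._entries: list[LedgerEntry] = []

    def record(self, account: str, amount: float, entered_by: str) -> LedgerEntry:
        entry = LedgerEntry(account, amount, entered_by, datetime.now(timezone.utc))
        self._entries.append(entry)
        return entry

    @property
    def entries(self) -> tuple[LedgerEntry, ...]:
        # Read-only view handed to downstream systems.
        return tuple(self._entries)


def account_balances(ledger: SystemOfRecord) -> dict[str, float]:
    """A 'system of engagement' concern: interpret the record as a report."""
    balances: dict[str, float] = {}
    for e in ledger.entries:
        balances[e.account] = balances.get(e.account, 0.0) + e.amount
    return balances
```

The point of the sketch is that any report is always derived from, and reconcilable back to, the original entries; the report never becomes a second source of truth.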

Complexity and reality

All of this is great in theory, but in practice, things are never quite as simple. Some significant recent incidents with key government systems have shown how problems with complexity can have terrible real-world consequences: 

1) Processing errors and HMRC’s misdirected penalty notices

Increasingly we are relying on automated systems that are anything but infallible. Programming errors that result in miscalculations should be rare, but mistakes still occur regularly. DWP's universal credit system is particularly renowned for such errors.

Errors can also occur when physical tasks are automated, such as with print runs that produce unintelligible documents or where automated envelope stuffing machines insert several letters into a single envelope.

HMRC has come under fire for numerous data breaches caused by misdirected penalty notices over the years. These errors call into question HMRC's assumption that once a penalty notice has been ‘issued’, it must have been received by the intended recipient.

Evidence of such errors has undermined HMRC’s standard assertion in tribunals that the fact that their system posted the notice, and it was not returned, means that it must be presumed to have been validly served on the taxpayer.

2) Logs, accountability and the Post Office Horizon system

Access to systems of record needs to be restricted. Accurate logs need to be kept of all entries and changes, along with who made them – providing confidence in the accuracy of record-keeping as well as accountability.
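As a rough illustration of the principle (not a description of Horizon or any real system), the sketch below extends the earlier idea: only named, authorised users may write, and every change is logged with who made it and what it replaced, so the log itself can serve as evidence.

```python
from datetime import datetime, timezone


class AuditedStore:
    """Illustrative store where every write is attributed and logged."""

    def __init__(self, authorised_users: set[str]) -> None:
        self._authorised = authorised_users
        self._values: dict[str, float] = {}
        self.audit_log: list[dict] = []  # who changed what, when, from what to what

    def set_value(self, key: str, value: float, user: str) -> None:
        if user not in self._authorised:
            raise PermissionError(f"{user} may not write to this store")
        self.audit_log.append({
            "user": user,
            "at": datetime.now(timezone.utc).isoformat(),
            "key": key,
            "old": self._values.get(key),
            "new": value,
        })
        self._values[key] = value
```

The Horizon failure was precisely that this guarantee did not hold in practice: entries could be altered through routes the logs did not capture, so the ‘evidence’ was not what it appeared to be.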

In a startling miscarriage of justice, 39 Post Office convictions were recently quashed after evidence provided by Fujitsu about the infallibility of its Horizon IT platform was called into question. 

A number of Post Office staff had been convicted and sent to jail on the basis of logs provided by the Fujitsu Horizon system as evidence of wrongdoing. The system was meant to be infallible and all entries were meant to be logged and traceable until it was found that the logs could be bypassed. 

3) Multiple versions of the truth and DWP’s data warehouse issues

Managing data warehouses and establishing a single version of the truth is a common issue with many large organisations and government departments. Distributed computer systems with multiple data entry points often have multiple data pools, leading to duplication or proliferation of records. 

This problem can occur where, for example, multiple government departments use your national insurance number as a unique identifier, but don’t have integrated systems. If you update your address on one system then it is not automatically updated on the others and the government then has you registered at two different addresses and doesn’t know which record is correct. 
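A toy example of how this divergence happens (hypothetical systems and data, not DWP’s actual schema): two departments each hold a record keyed on the same national insurance number, but an address update reaches only one of them.

```python
# Two un-integrated systems, each keyed on the same national insurance number.
dept_a = {"QQ123456C": {"name": "J. Smith", "address": "1 Old Street"}}
dept_b = {"QQ123456C": {"name": "J. Smith", "address": "1 Old Street"}}

# The citizen updates their address with department A only.
dept_a["QQ123456C"]["address"] = "22 New Road"

# Without integration or a master record, the 'truth' depends on who you ask.
for name, system in (("A", dept_a), ("B", dept_b)):
    print(f"Dept {name}: {system['QQ123456C']['address']}")
# Dept A: 22 New Road
# Dept B: 1 Old Street   <- neither system knows which record is correct
```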

It also happens within departments, where database integration has been inadequate. Aside from its well-documented difficulties with the universal credit system, the Department for Work and Pensions has been trying for years to move off Oracle Enterprise Data Warehouse in pursuit of a single version of the truth.

We have a big problem

All accountancy relies on a valid set of books from which accounts are derived. Three conditions are required: 

  1. They need to be valid/truthful,

  2. Any changes need to be logged accurately and the identity of the person making changes needs to be known for accountability, and

  3. There needs to be a single version of the truth.

The recent examples above show that there are massive problems in all these areas – and that these problems are not only relatively widespread, but can also have very serious consequences.

And it's only going to get worse

Artificial Intelligence (AI) has been heralded by some as a means of automating error-checking as well as spotting anomalies that could uncover potential fraud. In reality, AI is a double-edged sword. Automated systems can be used to spot and correct human error, but as it starts to be used to make changes to systems of record, how can AI be held accountable? 

How do you know if rather than correcting human error, an AI system is making mistakes of its own? And when it makes mistakes, who is accountable? The vendor that developed the system, the one that implemented it and set it up, or the team that has maintained it ever since? 

How do you know if the mistake was caused by an isolated but unfortunate error, or a fault that could reoccur, or even worse by malicious actions by either your own staff or by outsiders? 

Hackers are already seeking to gain access to AI systems in order to game or manipulate them. The big problem with such automated systems is actually knowing if or when they’ve been hacked in this way.

Realising that your AI system has been hacked might not be enough. If there have been numerous automated changes across numerous interrelated records then unpicking it all and establishing an accurate audit trail could be impossible – especially if hackers cover their tracks by deleting or amending logs.

Systems of record are absolutely essential and AI may well be an ‘intelligent’ enhancement, but neither of them is infallible. 100% reliability for automation is a bit like 100% protection for cybersecurity – both are myths. You have been warned.

Replies (7)


By memyself-eye
18th May 2021 18:41

Tax should be fair, easily understood and 'simple' to collect.
That mantra went out of the window years ago.......

Thanks (2)
By dmmarler
20th May 2021 10:17

And if you want an even larger mess, use a computer. The government still has not understood this, and only listens to and believes the salesmen.

Thanks (0)
By Mike Nicholas
20th May 2021 11:02

Great article.
The universal credit system has a long history of programming failures, particularly with calculating the amount due when there are fluctuations in the timings of payments of earnings reported by employers. For many years the DWP simply preferred not to contemplate revising the calculation programming.
As regards duplication of records, this continues as a 'feature' of HMRC's PAYE system.
AI is not some panacea for poor programming nor for deluded or 'ignorant' individuals who have far too much faith in their software.
Indeed, AI is simply programming but using the term AI is new sales speak, and suggestive of some 'higher' machine learning that can remedy human mistakes.

Thanks (1)
By Nick Graves
20th May 2021 11:33

As an ex-IT guy said unto me, "just because you can do it doesn't mean you should do it".

We've seen the abandonment of the KISS principle in just about everything, from rendering appliances & vehicles irreparable to frankly ludicrous over-regulation everywhere.

It really feels, sometimes, like Western Civilisation will end by tying itself into impossible knots.

Thanks (3)
By North East Accountant
20th May 2021 13:02

Remember at school the teacher used to bang on about showing your workings, so it was clear how you arrived at your answer.

It's not rocket science but a fundamental principle of life.

It's a pity HMRC (PAYE arrears notices etc) and Horizon do not grasp this most basic of concepts.

The answer is £18,326.84.... how did you arrive at that HMRC?

We have a PAYE arrears notice for a client for this exact amount, no-one at HMRC (who issued it) has a clue what it relates to but they are trying to collect it.

Doodle compared to MTD....

Thanks (1)
Replying to North East Accountant:
By flightdeck
20th May 2021 18:00

If they cannot show how this figure is arrived at then surely your client does not have to pay? People can't just make up a number and demand money – there must be some fundamental law against someone doing that?

Thanks (0)
By LVW4
20th May 2021 15:15

So true, Bill. But what's the answer?

Whenever a large government department embarks on a major transformation project, the emphasis is on what they do today, and not what they could do tomorrow. Agility and innovation offered by smaller suppliers is sacrificed on the altar of constantly awarding contracts to the large, politically connected suppliers, who often have a vested interest in maintaining the status quo, and we are back into a 5 year project cycle, at the end of which very little has changed. Except, the systems have become more complex, the suppliers have made a lot of money, and the government department refuses to hold the supplier to account for problems because certain department heads would have to be held to account for their decisions in awarding and managing the contract... catch 22.

The Horizon project is such an example. Without a Public Inquiry, Fujitsu won't be held to account because they are too deeply embedded in numerous government contracts, and nobody at a senior level at the Post Office will blame Fujitsu because it will reflect badly on them.

Thanks (2)