Questions are being raised about the effectiveness of the approach adopted by the UK Information Commissioner’s Office (ICO) after two major cases in which people’s lives were put at risk resulted only in reprimands.
In April, the UK Ministry of Justice was reprimanded after 14 bags of confidential documents were left in an unsecured holding area of a prison for 18 days. The information included medical records and security vetting details of both staff and prisoners. Those details were potentially viewed by 44 people, including prisoners who were seen by staff reading the documents.
In May, Thames Valley Police was reprimanded after officers failed to redact information from a response to a request for information from a housing authority, a disclosure that allowed suspected criminals to discover the address of a witness. The witness was forced to move house, and the risk to this individual remains high.
In an in-depth article published by trade journal Computer Weekly, a number of legal and data protection experts have criticised the ICO for the limited action it has taken. The decisions to reprimand rather than fine show that the approach the ICO announced last June is now being put into practice.
Impact of fines
The ICO’s view is that fining public organisations reduces the resources they have available, potentially affecting the public more than the service fined. Information commissioner John Edwards is quoted as saying: “the impact of fines issued to the public sector is often visited upon the victims of the breach themselves, in the form of reduced budgets for vital services”. He adds that fines against public sector bodies “do not affect those responsible for the breach in the same way that fining a private company can affect shareholders or directors”.
But those quoted in the Computer Weekly piece are concerned that this effectively means public sector bodies will never face punishment that amounts to more than a slap on the wrist. The ICO’s argument is that practical improvements are the preferred outcome. While critics acknowledge the importance of such outcomes, their concern is that confidence in the regulator will be undermined, that organisations will fail to make the necessary improvements, and that individuals will be left to their own devices to secure adequate compensation.
The issue of confidence in the ICO will become increasingly important in the debate on the regulation of AI, and in light of concerns about the provisions of the UK government’s proposed Data Protection and Digital Information Bill.