The following entry is a record in the “Catalogue of Catastrophe” – a list of failed or troubled projects from around the world.
Project type : Integrated Public Health Information System
Project name : Panorama
Date : Apr 2020
Cost : $115M
Audit reports can be a gold mine of information, revealing insights into the true status of a project. One such report that caught my eye this week covers an IT system developed in Canada to help identify, manage and track a pandemic (a timely topic given the situation in which the world now finds itself).
The project in question was initiated by the Canadian government as a follow-up to the 2003 SARS (Severe Acute Respiratory Syndrome) outbreak. Although that outbreak was relatively limited in scale, it prompted health officials to consider how they would handle a bigger outbreak, such as the 1918 Spanish Flu pandemic that killed tens of millions of people worldwide.
The resulting project was titled ‘Panorama’ and stemmed from a federal government mandate that sought to create a “seamless public health system that will allow public health professionals to coordinate activities in a carefully planned infrastructure.” Co-sponsored by the provincial government of British Columbia (BC) and the federally funded non-profit ‘Canada Health Infoway’ (whose mandate is to help provinces adopt digital health solutions), the plan was to build a system that could be used on a national basis.
In principle the vision was solid, but in execution the project was plagued with problems. The original 2006 plan called for a number of available Commercial Off-The-Shelf (COTS) systems to be integrated and adapted. Integrating the requirements of different provinces within the inherent constraints of a COTS-based approach, however, proved much more difficult than anticipated. By 2008 the problems were clearly manifest and the strategy was dropped. Following a reset, the project pivoted to a solution based on custom software development.
By the time the audit report was written seven years later (2015), the system was partially operational, but de-scoped requirements, quality problems and issues launching the product resulted in a system that fell short of its original aims. According to the report, key requirements that had been dropped from scope resulted in a system that “cannot be used to manage inter-provincial outbreaks, the main reason for which the system was built.”
Diving into the details, the report provides an inside view of a project that clearly went off the rails. Some of the comments in the report would be almost laughable, were this not a system on which the public’s health and wellbeing depend. Apparently the system suffered from ‘usability’ issues. To illustrate, the report notes that in places the button labeled ‘save’ didn’t mean ‘save’; instead it meant ‘cancel’. Elsewhere, buttons labeled ‘submit’ and ‘cancel’ could mean ‘save’! When launching a complex new system, designers need to gain the confidence of the user community, and such rudimentary mistakes must have immediately framed unfavourable attitudes towards the new system.
Elsewhere the report is a little frightening. It appears the system itself experienced a ‘pandemic level outbreak’ of bugs and glitches. According to the report, more than 11,000 defects were identified after the system started its rollout. Again to quote the report: “In the end, the accepted system did not meet user needs, and contained thousands of defects. Significant remediation was required along with the identification of more than 320 workarounds to make the system usable.”
To conclude, the report provides a set of insights into what went wrong (quoted from the report):
- The initial COTS approach was unrealistic and the project’s complexity was underestimated
- The project lacked a sufficiently experienced or robust leadership team to manage the effort
- Decisions were made without full, unbiased project information and options were not effectively considered or evaluated
- Stakeholders had competing priorities that were not effectively reconciled
- Acceptance testing was inadequate and the system was accepted from the vendor prematurely (i.e. before the full scale of the defects was understood).
Reading between the lines, I also see a poorly structured contract with the supplier. The report notes that the vendor was able to transfer risk to the public and was not held accountable for the stratospheric level of quality flaws, which in turn points to ineffective contract negotiation and/or contract administration.
For those aspiring to manage large-scale, complex projects, the report is recommended reading. A system of the type originally envisaged by Panorama has a level of complexity that is not immediately apparent to the untrained eye. It’s not just about writing code and designing database schemas. In fact, in the grand scheme of things, the software side was the easy part of the project. As always, the human side is where the challenge lies. On the human side, a project like this is about unifying work processes, synchronizing terminology and gaining consensus across divergent stakeholders. Add to that the challenge of working with technology vendors who can hide behind a contract, and the stage is set for problems to fester. All too often those types of challenges go unseen as organizations focus on the physical hardware and software.
So of course here we are in 2020, and Panorama’s moment in history has arrived. In talking to some in the Canadian health care sector, it appears parts of the system are in use, but key components that should have been at the forefront of the battle against Covid-19 are not.
Contributing factors as reported in the press:
- Lack of stakeholder engagement
- Insufficient experience in the project’s leadership structure
- Poor strategic choices in the early stages of the project
- Poor procurement management practices
- Lack of design standards
- Quality control issues