

A quick conversation at the side of a desk to check whether “account” in one document means the same thing as “policy” or “plan” or “contract” in another gives way to remote assumptions, with all the dangers they entail. Over the years I have seen numerous examples of hard work undone by the ambiguity of language or a lack of precision in the definition of terms. It was those experiences that led the team at Altus to develop our distinctive model-driven approach to change.
Analysing the programme failures we had witnessed over our different careers, we saw a common pattern. A programme would start strongly, report good progress throughout the analysis phase and swiftly move into development. PMO reports were green and the various project streams would quickly move to 80% complete. But then they would stick there, stubbornly, as more and more issues were escalated to the risk log. Those issues would usually turn out to involve some aspect of another stream and, on further investigation, would reveal an incompatibility between the two. In the worst cases, one of those issues would eventually torpedo the entire programme. On reflection, those issues should have been spotted much earlier in the programme, when they could have been fixed relatively painlessly. The key, we reasoned, was to have a robust model of the business and to use that model to test the planned change during analysis rather than waiting to see what happened in development. Much the same approach, in fact, that architects and surveyors take to designing buildings before construction begins.
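To make that idea a little more concrete, here is a minimal sketch of what “testing the planned change during analysis” might look like once the model is written down explicitly. It is purely illustrative: the terms, attributes and stream names are invented and this is not how PEAK itself works. The point is simply that, with a shared model in place, each stream’s assumptions can be checked against it before development starts rather than surfacing as risk-log items at 80% complete.

```python
# Hypothetical sketch: catch cross-stream incompatibilities during analysis
# by checking each project stream's assumptions against one shared model.
# All names ("policy", "plan_type", "digital_portal") are illustrative only.

# Canonical model: each agreed business term and the attributes it carries.
business_model = {
    "policy": {"policy_id", "holder", "premium_frequency", "start_date"},
    "claim": {"claim_id", "policy_id", "amount", "status"},
}

# What each project stream assumes about the terms it touches.
stream_assumptions = {
    "digital_portal": {"policy": {"policy_id", "holder", "plan_type"}},
    "claims_replatform": {"claim": {"claim_id", "policy_id", "amount"}},
}

def check_streams(model, streams):
    """Report any attribute a stream relies on that the shared model does not define."""
    issues = []
    for stream, terms in streams.items():
        for term, attrs in terms.items():
            known = model.get(term, set())
            for missing in sorted(attrs - known):
                issues.append(f"{stream}: '{term}.{missing}' is not in the agreed model")
    return issues

for issue in check_streams(business_model, stream_assumptions):
    print(issue)
# -> digital_portal: 'policy.plan_type' is not in the agreed model
```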
So, what should that model look like? Rather than relying on our collective experience and some workshops to thrash one out, we decided to take a step back and develop a method for building our models on the most solid foundation available in financial services – data. Our industry produces no tangible output; instead it consumes data, transforms and records it, then passes it out in the form of policies and statements. What better way, then, to model a financial services business than to chart the flow of data to and from that business and then identify the operations that transform it? Since that eureka moment, we have applied this technique to over 100 businesses and shaped projects ranging from the selection of technology components to the outsourcing of entire operations. Importantly, the method is repeatable and can be learned, which means clients can tailor and extend the models for their own business without the need for continual support.
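As a rough illustration of the data-flow idea (the structures and names below are assumptions made for the sake of example, not the PEAK model itself), a business can be captured as a set of named data flows in and out, plus the operations that transform them. Even a toy version supports useful checks, such as spotting flows that no operation consumes or produces.

```python
# Minimal sketch of a data-flow model: flows of data to and from the business,
# and the operations that transform them. Names and structures are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    name: str          # e.g. "application form", "policy schedule"
    source: str        # where the data comes from
    destination: str   # where it goes

@dataclass(frozen=True)
class Operation:
    name: str                  # the transformation performed
    consumes: tuple[str, ...]  # names of inbound flows
    produces: tuple[str, ...]  # names of outbound flows

flows = [
    Flow("application form", "customer", "new business team"),
    Flow("policy schedule", "new business team", "customer"),
    Flow("bank statement", "bank", "finance team"),
]

operations = [
    Operation("set up policy", consumes=("application form",), produces=("policy schedule",)),
]

# Simple completeness check: every charted flow should be consumed or produced
# by at least one operation; anything left over needs further analysis.
touched = {f for op in operations for f in op.consumes + op.produces}
orphans = [f.name for f in flows if f.name not in touched]
print("Unmapped flows:", orphans or "none")
# -> Unmapped flows: ['bank statement']
```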
In the early days, the work was done on whiteboards, recreated in Visio and analysed in Excel. We added formulas, macros and links to automate more of the process but, over time, it became increasingly difficult to manage the various versions, linkages and duplication. That’s when we decided to add our own technology expertise into the mix and build a dedicated tool to manage the models and how they are used. We called it PEAK, which made lots of sense when our company logo included a mountain but now looks a bit more obscure since the latest rebrand!
When we set out to develop PEAK, we had some clear design principles, which continue to guide its roadmap today.
These principles have led to the creation of what I believe is the most complete and flexible corporate memory in financial services. The comprehensive industry models that form the backbone of PEAK enable us to add immediate value to almost any assignment, and they provided an instant framework for the COVID-19 analyses you can see on our website. The fact that this analysis was distributed around the team, completed remotely and then aggregated without gaps or overlap neatly demonstrates the value of an engineered model.
As it dawns on firms that remote working is here to stay and that change still needs to happen, a new distributed approach to projects will become essential. Whatever shape that takes, the use of models will be central, and we expect a new breed of change professional to emerge – the model citizen!