How Information Technology Drives an Organisation to its Default Future

Peter Boggis argues that technology, particularly information technology, is a major factor determining an organisation’s default future, and one that needs to be fully understood by business and technology executives if they are to make informed choices on how best to navigate their organisation to an improved future.

My colleagues and I have argued consistently that every individual, organisation and even every country has a default future. This is where they will end up if no action is taken, other than that currently planned. For some organisations the default future may be perfectly acceptable – possible, but unlikely. For most, it provides a focus for taking the actions necessary to create an improved future.

In a recent article my colleague David Trafford argues that an organisation’s default future is ultimately determined by a small number of driving forces, which include regulation, the economy, customers, past winning strategies, talent, technology and people’s mindsets. In effect, these driving forces define the trajectory that will ultimately lead the organisation to its default future. It’s therefore the role of leaders to identify and influence those driving forces that they can control – and to insulate their organisation from those over which they have no influence.

In this article I want to focus on technology – particularly information technology – as a dominant force that is driving an organisation to its default future. I’ll argue that this driving force has three dimensions, each of which needs to be understood if informed choices are to be made on how best to navigate an organisation to an improved future. I should stress that technology is one, albeit important, driving force and leaders planning an improved future need to give appropriate consideration to the other forces discussed in David’s article. I also want to introduce a different way of thinking about making future technology choices and the role that CIOs need to take in helping their executive colleagues understand the implications of technology choices.

Dimension 1: Legacy of past technology choices

It’s safe to say that, today, most organisations are critically dependent on their IT systems. For many, the technology is so pervasive that it’s difficult to think about change without first considering the IT implications. In information-intensive businesses, such as banking and insurance, the systems have become so complex – after years of incremental enhancements – that the effort and risk of making further changes is significant. A recent example is the problem faced by Royal Bank of Scotland when it undertook what it thought was a routine upgrade. The resulting outage left some 12 million customers unable to access their accounts for up to three weeks, and businesses unable to pay their employees.

Complexity is not the only issue – many of the systems still in use today were designed to support business operating models of the past, for example supporting business functions and products as opposed to processes and customers. This lack of alignment between the architecture of the installed base and the desire of the business to operate in different ways can significantly constrain leaders’ choices on the future of their business. Past technology choices – made with the best intent in a different context (see below) – create what is often called the technology legacy. The reality is that legacy systems are invariably those at the core of the organisation, and it is they that drive it to its default future. The challenge is to understand the true significance of this legacy and make choices that will lead to an improved future.

The importance of context

The reality is that decisions taken today about the use of technology will become tomorrow’s legacy. The only question is when that legacy will be recognised as a force driving the organisation to a new, unacceptable default future. The issue therefore becomes one of time horizon. Would two, five or fifteen years be acceptable? While this obviously depends upon the pace of change facing an organisation, it also depends upon the context of the choices being made.

For example, if the context is one of cost reduction, it’s most unlikely that an organisation would choose to re-platform its systems with ones that provide greater integration and agility. Equally, if the context is one of growth and ambition, it’s more likely that an organisation would embark on an ambitious plan to transform its industry – with the risk that this brings.

A classic example is the UK Department of Health’s National Programme for IT that started in 2002 – during a period of optimism and growth. According to a National Audit Office report last year, £6.4bn had been spent on the programme, with a further £4.3bn earmarked. It is now acknowledged that the programme will not meet its original objectives and the value delivered from the investment is questioned by many.

Context is everything, as it defines the lens through which we see the world and frames our thinking when it comes to making choices.

Dimension 2: Impact of emerging technologies

Over the past 50 years we have seen many examples of how information technologies have fundamentally changed industries and societies – the most obvious being television, mobile phones, digital music, digital photography, the internet and social media. It’s important to note that each of these so-called disruptive technologies created its own demand. To quote Steve Jobs: “You can’t ask customers what they want – you have to give them something they don’t yet know they want”. This is equally true when thinking about the potential contribution that new and emerging technologies could make to improving an organisation’s future. The simple answer is we often don’t know until it happens. Equally, these emerging technologies could have a profound impact on an organisation’s default future by making it even more unacceptable. In fact, it is often not the organisation itself that changes, but the environment or context within which it operates – and how it responds or reacts to that change. A recent example is the impact of digital photography on Kodak. The irony is that Kodak was a pioneer in the development of this technology.

The reality is that most new technologies – whatever comes beyond the iPad, cloud computing, NFC (Near Field Communication), flexible screens, TV everywhere, voice control, second-screen experiences, 3D printing, RFID, HTML5 and IBM’s Watson – are already present. Some will be successful and create their own demand; some will fall by the wayside. To quote William Ford Gibson, the American-Canadian writer: “The future is already here – it’s just not very evenly distributed”.

If an organisation is unaware of the potential impact of emerging technologies on its industry – or of how competitors intend to use them – its default future could change to one that is not only unacceptable, but that will arrive sooner.

Dimension 3: Mindset and competencies

In her book The Last Word on Power, Tracy Goss argues that individuals, groups and organisations all have winning strategies. A winning strategy is the set of behaviours that has delivered success in the past. We may not recognise that we have one or be able to describe it, but we act in the belief that it will continue to deliver success. This may be true if the context does not change, but when it does, Goss argues, a new winning strategy is required. Unfortunately, it’s very difficult, and sometimes impossible, for individuals, groups or organisations to re-align their winning strategy to the new context. IT organisations often face this challenge, where the prevailing mindset or core belief is “what has made us successful in the past is sufficient to make us successful in the future”.

It’s often overlooked that most IT organisations are successful most of the time. In the main they do what is expected of them: they keep the systems running, fix problems when they arise and manage the risks that come with a complex installed base. They do everything to avoid their CIO getting a call from their CEO asking when the systems – and the business – will be up and running again. Unfortunately, this sometimes happens – as in the case of Royal Bank of Scotland mentioned above, and of BlackBerry, where an outage of its email, messaging and browsing services affected over half of its 70 million customers over a three-day period in October 2011. Meeting these expectations develops a mindset and set of competencies that are appropriate to the current context. Should the context change – for example following a decision to transform the business through the re-platforming of core IT systems – the existing mindset and competencies are likely to become inappropriate. Add to this a decision to use an offshore provider to configure and maintain the new platform, and you create a context that most IT organisations would struggle with, particularly when they are also expected to manage the legacy systems during the transition.

The reality is that current mindsets and competencies (the foundation of winning strategies) create an anchor to the present. They become a force driving an organisation to its default future, one that IT and business leaders need to understand if they are to make the correct choices about how best to navigate to an improved future.

Aligning technology choices to an improved future

If the context is fixing today’s problems, then the outcome is likely to be a better short-term future. If the context is creating an improved future, then the outcome is likely to be better over the longer term. But how can an improved future be defined in business terms that enable informed technology choices?

One of the most effective ways of achieving this is to begin by defining how the business wants to operate at a point in the future through a small number of guiding principles. This is a robust, proven approach to developing objective, soundly based choices. Let’s explore it further through a couple of examples.

Example 1: Becoming customer-centric
“We want to build and manage an integrated, consistent, coherent and complete view of a customer’s relationship with us – thereby enabling us to increase the number of products and services we sell and increase the breadth and depth of our relationship with customers.”

The implications of such a business principle for future technology choices are non-trivial. For example, many legacy systems in banking and insurance were developed decades ago and are product-based rather than customer-based. Banks and other financial services companies have spent untold billions of dollars trying to be more customer-centric by leveraging data from inflexible legacy systems. In many cases, because of their age and design, no one fully understands how the systems work or what impact changes may have.

But, if that is the direction the business wants to pursue, then future technology choices need to reflect this. In many cases, it will be easier and less painful – although costly – to simply start again, based upon a modern, well-designed ‘bank-in-a-box’ solution. As discussed earlier, taking this route could have significant implications for the IT organisation, as it would need to rethink its role from systems developer to systems configurator and integrator. It would also need to develop new competencies, including the ability to manage the often complex relationship with the platform provider.

For some traditional banks and financial services companies, the sheer size of the historic investment in legacy systems and the on-going cost of maintaining and enhancing them represent a driving force that they may not be able to change.

Example 2: Supporting rapid integration
“We will grow by acquisition and integrate those acquisitions in 90 days or less.”

This business principle articulates a conscious choice to grow through acquisitions rather than relying solely on organic growth in increasingly mature markets, and sets a target for realising the synergy benefits within a three-month time-box. Sounds heroic? Cannot possibly be done? In fact, such a business principle has been articulated by the likes of Kraft General Foods (with limited success) and the technology firm Cisco. The book Net Ready, which tells the story of Cisco’s evolution and growth, reveals that its CIO spent almost a decade putting in place a technology architecture to do exactly that. At the peak of its growth, Cisco was acquiring relatively small companies at the rate of several per month. The IT architecture had been designed explicitly to achieve rapid integration of each acquisition into the “Cisco way of doing things”.

Other examples include many global pharmaceutical and chemicals companies, where the sheer diversity and complexity of hundreds of different ERP systems across hundreds of countries simply represents too high a cost to serve – and also constrains growth. Rationalising and standardising these systems to a smaller number – five to seven is often quoted – enables more rapid growth in emerging markets, more cost-effective systems in larger countries and the opportunity to have a small number of Centres of Excellence, which can design and help implement ‘common template’ solutions much more quickly and cost-effectively. Companies such as Shell recognised more than two decades ago that this would be a long journey, and over time moved from a multiplicity of national systems, to regional hubs and clusters, to truly global ERP solutions built around a small number of global business processes – typically six to eight – each with a global process owner.

So, the best IT choices are based upon a small number of clear direction-setting business principles. This is the minimum that a CIO needs to have articulated in order to develop the kind of IT architecture that enables the organisation to move to its improved future.

It’s also worth noting that some of these choices will be strategic (see my article Strategy – The World of Choices and their Implications) as they will be difficult if not impossible to reverse once taken, whilst others could be reversed, albeit at a cost, should the need arise.

Role of the CIO

If technology were a commodity, with no downstream legacy implications, then it could be argued that there is no need for a CIO. As this is clearly not the case, I believe that the prime role of a CIO is to prevent the business from making technology-related choices that take it to an unacceptable default future, and to encourage the business to make choices that lead it to an improved future. The CIO therefore has a crucial role to play, including:

  • Creating a shared understanding with business executives about the default future that past technology choices are leading to.
  • Helping business executives understand the implications of future technology choices on the future legacy they will create.
  • Anticipating the impact that emerging technologies could have on both the organisation’s current default future and the role they could play in creating an improved future.
  • Recognising that the winning strategy for their IT organisation may not be what is necessary to lead the enterprise to an improved future.

I welcome your thoughts.

Peter Boggis                                             

Formicio Insight Article: How Information Technology Drives an Organisation to its Default Future
