Who should lead your master data improvement programme? Someone who really cares

30 May 2014

John Hannah


Consultant

Even in today's interconnected, 24/7, tech-savvy business environment, very few companies can claim to be free of problems relating to the quality of their master data.

There are many reasons for this, but for now let’s simply assume that your organisation is one of the many, not the few, and that you want to do something to address this. Let’s also assume that you’re in the process of setting up a master data improvement programme or initiative, and that you’ve reached the point where you need to appoint someone from within your organisation to lead it.

Who do you pick? What qualities do you look for? In my experience, this decision is a key determinant of the level of success of the eventual outcome.

My objective here is to highlight some of the special characteristics required of such a leader, which can make a big difference both to the approach the programme takes and to the benefits that can be achieved from it.

The Basics: Cross-functional understanding and influence

Let’s start with some of the basics: what are some of the characteristics required for a successful leader of a large master data improvement programme? Ideally they’d include the following:
 

  • Business understanding. They need to know the business well, and be able to operate with a cross-functional perspective, acting as an honest broker between the many different business functions (each with their own agendas) which are inevitably involved as participants in many master data processes
  • Programme / project management skills. They should have project management and diplomatic skills so they can operate effectively on the ambiguous boundary between the business and IT. Master data improvement frequently demands an array of related initiatives – both business and IT focused. It’s rarely an isolated one-off undertaking. Many different resource inputs may be required, often on a part-time basis, and usually from departments and functions that may feel they have better things to do
  • Political power. Any serious master data improvement programme will typically require changes to data-related roles and responsibilities. Often this can include tension between centralisation and decentralisation of control. So the person leading this initiative must be credible enough to be able to influence both top management and the other stakeholders involved. In particular, he / she must be able to find ways to articulate the business case for master data improvement effectively. Failure to establish and maintain a convincing, and sustained, business case is one of the biggest barriers to success of master data improvement initiatives.

Not a particularly radical list of characteristics, I suspect you would agree. But there is one more characteristic missing, and it is possibly the most important of all:

  • An interest in master data. I don’t mean just interested in executing a well-run project, so that they can quickly move on to the next one. I mean someone who is genuinely interested in master data, and who understands its importance to the business at multiple levels. Let’s face it - to many people, master data may not rate among the most exciting topics in the business world. But companies cannot operate without it. And if you have someone who does care about it, it can make a huge difference to the quality of the outcome.

Someone who cares

There are several ways in which the appointment of someone who cares as leader can influence the success of a master data improvement programme.

  • Formulating a compelling business case. Someone who cares is best placed to make a convincing case for the master data improvement initiative. By understanding both the nature and causes of data errors, and being able to describe them in terms of the potential business impact, they can articulate the business case in ways that the sponsors can understand. Because in many organisations there can be few measured KPIs relating specifically to master data quality, one of the most powerful ways to help make the business case is to dig out the “war stories” where real pain was caused by previous master data problems (eg “when we had the wrong price on the website…”). Every organisation has its share of these
  • A focus on the sharp end. Someone who cares will spend time with those directly engaged at the coal face of master data creation and maintenance, to understand their day-to-day issues and experiences. Often those executing the master data maintenance are best placed to identify the true causes of the issues, and, quite often, they can suggest options for resolving many of them as well. Too often, they are overlooked as mere “administrators” and barely consulted
  • A willingness to get stuck in. Someone who cares will be interested enough to get involved in studying the master data itself in enough detail to identify error patterns and execute basic root cause analysis. If you match someone who understands the business well with a large extract of master data records, you will be surprised how quickly and easily it is possible to spot recurring patterns, errors and potential areas for improvement. This goes beyond straightforward errors like missing data, to include data inconsistencies, incorrect / illogical entries, and more
  • Perspective. Someone who cares will resist the urge to throw new technology in as the solution before the basic problems are properly understood. Dedicated technical solutions, which I call “big MDM”, certainly have their place. But they should not be used as a substitute for proper attention to business roles and processes for master data, which I call “small mdm”. Jumping straight to big MDM can be a very expensive way of trying to solve master data issues which could be addressed more simply via some focused attention on small mdm.
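To illustrate the “getting stuck in” point above, even a very simple profiling pass over an extract of master data records can surface the recurring error patterns described. The sketch below uses entirely hypothetical product records and illustrative rules (blank fields, illogical prices, inconsistent unit codes, duplicate keys); the field names and checks are assumptions, not taken from any real system.

```python
# A minimal master data profiling sketch over hypothetical product records.
# Field names ("sku", "price", "unit") and the quality rules are illustrative.
from collections import Counter

records = [
    {"sku": "A100", "description": "Widget", "price": "9.99",  "unit": "EA"},
    {"sku": "A101", "description": "",       "price": "12.50", "unit": "EA"},
    {"sku": "A102", "description": "Gadget", "price": "-1.00", "unit": "ea"},
    {"sku": "A100", "description": "Widget", "price": "9.99",  "unit": "EA"},
]

missing = Counter()        # fields left blank (missing data)
invalid_price = []         # illogical entries (price <= 0)
inconsistent_unit = []     # inconsistent coding (unit not in the agreed uppercase form)
duplicate_skus = [sku for sku, n in
                  Counter(r["sku"] for r in records).items() if n > 1]

for r in records:
    for field, value in r.items():
        if not value.strip():
            missing[field] += 1
    if float(r["price"]) <= 0:
        invalid_price.append(r["sku"])
    if r["unit"] != r["unit"].upper():
        inconsistent_unit.append(r["sku"])

print("Missing values by field:", dict(missing))   # {'description': 1}
print("Illogical prices:", invalid_price)          # ['A102']
print("Inconsistent units:", inconsistent_unit)    # ['A102']
print("Duplicate SKUs:", duplicate_skus)           # ['A100']
```

The point is not the tooling - any spreadsheet would do - but that someone who knows the business can run this kind of check over a real extract and immediately see which error patterns matter.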

Pitfalls to avoid

It also follows from this that a common factor in less successful master data improvement programmes is that nobody cares enough. I don’t mean that no-one involved cares; of course they do. But too many data improvement initiatives either lose their way or fail to deliver the level of benefits that they should.

I have seen such initiatives fail to reach their potential for a variety of reasons:

  • The project leader is too remote. The leader of the improvement initiative is too remote from the master data and from those involved in its creation and maintenance, and not interested in the detail, resulting in superficial analysis, lightweight remedies and missed opportunities for improvement
  • Lack of influence. Sometimes the key points of failure are known but the political clout to address them is missing. For example, it may be known that most of the poor quality product data originates from one rogue marketing team, who are known to be weak on their administration, but the central data team, and the leader, lack the power to influence marketing management to address it
  • The big picture is missing. Those involved in the day to day master data administration may be too immersed in the procedural detail to raise their heads up and observe or address the repeating patterns of data quality issues. No-one is asking the “why do we do it this way?” question strongly enough. If the leader also fails to see or articulate the big picture, then it becomes harder to justify and maintain the business case for action
  • Short term outlook. Sometimes great effort may be expended on an initiative to clean up the data errors, but without sufficient attention to the root causes of those errors. This then becomes a vicious circle, with a repeating cycle of data cleansing effort which provides only short term respite because the more strategic, end to end process view is missing
  • Lack of a joined up strategy. The worst single case of this I have seen in an organisation was where there were two simultaneous data improvement initiatives in place for the same vendor master data, each working in ignorance of the other. One was sponsored by finance, and one was sponsored by the purchasing function, with each potentially contacting the same vendors as part of the data cleansing process!         

In summary

All organisations need to remain vigilant to protect the quality of one of their greatest assets: their master data. Unless they take a joined-up, strategic approach to master data improvement, they won’t be able to fully realise the benefits that can be achieved. Effective master data management requires proactive management effort as well as reactive administrative support.

Do not underestimate the level of challenge involved in achieving truly sustainable improvements in master data quality. And one of the best ways to maximise the success of your master data improvement initiative is to find and appoint someone who cares to lead it.   
