You have most likely heard the phrase “garbage in, garbage out”, right? According to World Wide Words, the phrase has been in use since the early 1960s and refers to the notion that the results of a computation are only as reliable as the data you put in. The principle is just as valid today as it was 50 years ago, with the difference that nowadays just about everything is computerized, which means you will be facing huge piles of garbage if you don’t ensure data quality as early as possible.


Treat the cause, not the symptom

Considering the fundamental importance of data quality, it is surprising how often it is still overlooked. Organizations struggle with IT service management quality issues, but often focus on treating the symptoms (for instance, re-engineering seemingly inefficient support processes) rather than the underlying causes, such as outdated and incomplete service data, change resistance, or a lack of relevant skills.

Sure, a poorly designed process can itself be the cause, but if data quality doesn’t feel that important to you, ask yourself these questions:

  • Does your Service Desk often have to forward “easy” incidents to Onsite Support just because it doesn’t have proper information on the customer’s hardware and software? And which is more cost efficient: Service Desk ticket resolution or Onsite Support ticket resolution?
  • Are you able to do timely hardware renewals, based on complete and current data about hardware leasing contracts or warranty times?
  • Do you often have to clarify and defend your invoicing basis (whether it is internal cost allocation or invoicing the “real” customers) since you don’t have current and transparent data to rely on?

These are just some of the typical challenges that improving data quality can relieve, and with a little effort you can probably come up with several more.

Consequences of poor data quality

Arcplan proposes three general consequences for sub-par data quality:

  1. Mistrust,
  2. Bad or delayed decisions, and
  3. Wasted money

There can be other consequences as well, but these are examples most of us can relate to. Gartner research* shows that poor data quality is a primary reason why 40% of business initiatives fail to achieve their targeted benefits, and that it affects overall labor productivity by as much as 20%. If this still doesn’t convince you, check out SAP’s infographic about the cost of dirty data.

What to do then?

So if you are not capable of turning garbage into gold, make sure you don’t put garbage in. Even top-notch IT service management solutions, such as ServiceNow or Efecte, cannot perform that alchemy either, so make sure they are fed with quality data. Easier said than done, sure, but there are four basic steps in this process that have to be in order:

  1. Discover and inventory current data from all company devices. Make sure that you can do it whenever needed and for all devices – even when they are not on the company network.
  2. Enrich the data from other sources, such as antivirus systems and financial systems, to ensure data completeness. The enriched data needs to be current as well, which should be taken into account when considering the integration methods.
  3. Cleanse and normalize data to make it accurate. It has to be presented in an understandable format and there cannot be multiple versions of the truth.
  4. Make this accurate, complete, and current data readily shareable, so that it can be utilized by IT service management and, as a result, brings value to you.
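The four steps above can be sketched in code. This is a minimal, hypothetical illustration – the device records, field names (serial, model, OS, lease end date), and the leasing lookup are all invented for the example, not taken from any real inventory tool:

```python
def discover() -> list[dict]:
    # Step 1: in practice this data would come from an inventory agent
    # or network scan; here two raw records stand in for discovered data.
    # Note the duplicate device with inconsistent casing and spacing.
    return [
        {"serial": "abc-123", "model": "latitude 7440 ", "os": "Win11"},
        {"serial": "ABC-123", "model": "Latitude 7440", "os": "Windows 11"},
    ]

def enrich(devices: list[dict], leasing: dict) -> list[dict]:
    # Step 2: join data from another source (here, a leasing-contract
    # lookup keyed by serial number) onto each discovered record.
    for d in devices:
        d["lease_ends"] = leasing.get(d["serial"].upper())
    return devices

def normalize(devices: list[dict]) -> list[dict]:
    # Step 3: cleanse and normalize – trim whitespace, unify casing,
    # map OS aliases to one canonical value, and deduplicate by serial
    # so there is only one version of the truth per device.
    os_aliases = {"win11": "Windows 11", "windows 11": "Windows 11"}
    cleaned = {}
    for d in devices:
        serial = d["serial"].strip().upper()
        cleaned[serial] = {
            "serial": serial,
            "model": d["model"].strip(),
            "os": os_aliases.get(d["os"].strip().lower(), d["os"]),
            "lease_ends": d.get("lease_ends"),
        }
    return list(cleaned.values())

def share(devices: list[dict]) -> None:
    # Step 4: make the cleaned data available, e.g. print it here in
    # place of an export to an ITSM tool.
    for d in devices:
        print(d)

leasing = {"ABC-123": "2025-06-30"}
share(normalize(enrich(discover(), leasing)))
```

The two raw records collapse into a single accurate one – complete with its lease end date – which is the kind of record an ITSM solution can actually build on.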

If you are able to implement these steps successfully, you have a solid foundation to build your IT service management on. Of course, the work doesn’t stop there, but at least you have one corner well covered when you come out victorious from the data battle!


If you have any questions or comments, please email us at

*Gartner (Ted Friedman, Michael Smith): Measuring the Business Value of Data Quality (October 2011)

Simo Kari


Chief Content Officer at Miradore Ltd
Simo Kari has been the Chief Content Officer for Miradore since February 2014. Prior to joining Miradore in 2013, he worked for several years in various management positions at HCL Technologies and UPM. He has versatile experience in developing, implementing, and operating IT services in international environments. Simo holds an MSc from the University of Liverpool. | LinkedIn
