Achieving data integration zen: building your business case
The closely related issues of siloed data and a lack of data integration – stemming, in part, from reliance on older data warehouses and a general inability to establish proper data governance – aren’t just slowing down your business intelligence efforts. They’re keeping you from competing in today’s data economy.

Data silos play a huge role in thwarting organizational productivity, damaging profitability and undermining employee attraction and retention. Indeed, Database Trends and Applications reports that poor data quality cuts productivity by as much as 20 percent and prevents 40 percent of business initiatives from achieving their targets. And a recent Gartner survey found the problem costs businesses an average of $15 million per year, adding that bad data hurts organizations competitively and exposes them to client mistrust. According to the Gartner survey, organizations can take several steps to help remedy poor data quality, including:
- Measuring the impact. Although the late management consulting guru Peter Drucker once said that “what gets measured gets improved,” most organizations haven’t heeded this advice when it comes to data integrity. The survey indicates that 60 percent of organizations don’t measure how much all that bad data costs – meaning they’re not just unaware of the scale of the problem, but also in the dark about its potential impact on the rest of the business. (A back-of-envelope cost model is sketched after this list.)
- Creating data stewards. Pythian advises creating an Analytics Center of Excellence to help plan, execute, promote and govern your data program out of the gate. But even if you don’t create a formal group, the Gartner survey says it’s crucial to establish data steward roles (and, if possible, a Chief Data Officer to oversee your data quality initiatives) to improve accountability and proactively prevent your program from going off the rails.
- Optimizing data quality costs. We know not all organizations ignore the data silo problem. But in trying to keep data integrity costs down, some overspend elsewhere: an average of $208,000 a year on on-premises data quality tools, according to Gartner. Migrating your on-premises data warehouse to the cloud can help in this regard. A cloud-native data platform (which is essentially a cloud data integration platform) is the most cost-effective and scalable way to ensure data quality and unity, especially when dealing with new types of data from edge devices and the IoT. (A minimal data-quality profiling sketch also follows this list.)
- Assessing the maturity of your current master data management (MDM) strategy, if you have one.
- Developing a cross-organizational implementation plan.
- Identifying your desired end state and understanding the associated costs and benefits.
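If you’re in the 60 percent of organizations that don’t measure what bad data costs, even a crude model is a useful starting point. Below is a minimal back-of-envelope sketch in Python; the function name, parameters and every input figure are our own hypothetical placeholders (not Gartner’s methodology), so substitute your own payroll, initiative spend and loss estimates.

```python
# Back-of-envelope model for the annual cost of poor data quality.
# All inputs are hypothetical placeholders -- substitute your own figures.

def bad_data_cost(
    annual_payroll: float,      # total annual payroll for data-dependent staff
    productivity_loss: float,   # fraction of productivity lost to bad data (DBTA cites up to 0.20)
    initiative_budget: float,   # annual spend on business initiatives
    initiatives_missed: float,  # fraction of initiatives missing targets due to bad data (up to 0.40)
    value_at_risk: float,       # fraction of a missed initiative's budget effectively wasted
) -> float:
    """Estimate the yearly cost of poor data quality in dollars."""
    labor_cost = annual_payroll * productivity_loss
    initiative_cost = initiative_budget * initiatives_missed * value_at_risk
    return labor_cost + initiative_cost

# Example: a mid-sized firm with $50M payroll and $20M of initiative spend.
estimate = bad_data_cost(
    annual_payroll=50_000_000,
    productivity_loss=0.10,    # assume 10% lost productivity
    initiative_budget=20_000_000,
    initiatives_missed=0.25,   # assume 25% of initiatives miss targets
    value_at_risk=0.5,         # assume half of each missed budget is wasted
)
print(f"Estimated annual cost of bad data: ${estimate:,.0f}")
```

Even a model this simple gives you a number to track quarter over quarter – which is the point of Drucker’s advice.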
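Commercial data quality tools do far more than this, but a first-pass profile of a dataset is cheap to build yourself. The sketch below assumes a pandas-based stack and a placeholder CSV path; it reports the duplicate-row count and per-column null rates – the kind of baseline metrics a data steward might track over time.

```python
# Minimal data-quality profile: duplicate rows and per-column null rates.
# pandas is an assumed dependency; the CSV path and table name are placeholders.
import pandas as pd

def profile(df: pd.DataFrame, name: str) -> None:
    """Print baseline quality metrics for one table."""
    print(f"--- {name}: {len(df)} rows ---")
    print(f"duplicate rows: {df.duplicated().sum()}")
    null_rates = df.isna().mean().sort_values(ascending=False)
    for column, rate in null_rates.items():
        print(f"{column}: {rate:.1%} null")

customers = pd.read_csv("customers.csv")  # placeholder path
profile(customers, "customers")
```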