These days, the hot analogy in the analytics industry is that “data is the new oil.” Like oil, data must be found, extracted, refined and distributed. More and more companies are investing heavily in cutting-edge technologies like machine learning, artificial intelligence and big data processing to help them harvest, groom and apply data to the right use cases for maximum value. So why, then, in the midst of this prospectors’ rush, do studies of business intelligence (BI) implementations repeatedly indicate that 60 to 85 percent of BI projects fail?
While technology is changing rapidly, the way most organizations manage data has stagnated. Traditionally, the IT team has been cast as an all-knowing, all-capable “data priest,” producing exactly the report the business requests. We’ve seen businesses focus heavily on acquiring and storing data as cheaply as possible while neglecting the equally important questions of business use and governance. As a result, data management organizations (DMOs) often cannot withstand the waves of change from sources such as new technology, organizational drivers and government regulations like the General Data Protection Regulation (GDPR).
With that history in mind, I want to offer a few considerations for organizations analyzing their DMOs.
Don Loden