ASG Perspectives

Blog > March 2019

Data Intelligence: The Journey to Business Agility Starts with an “As Is” Inventory

Here at ASG, metadata, governance and enterprise data management have long been the cornerstones of our investment and implementation best practices. In just the last two years, we have seen heightened interest from a growing number of clients who need to strategically align their business to their data, and to really get it right this time. Our clients can no longer afford data delays that limit the business's ability to react quickly and decisively to marketplace disruptions.
 
When starting out, we find that client organizations typically have pockets of managed data, but for the most part that data is not consistently managed, automated or visible holistically across business borders and technologies. The data is buried in complexity, it is redundant, and it is highly exposed to corporate and compliance risk.
 
To solve for this complexity, the data intelligence best practice is to take stock of the “as is” data landscape. That is: what data is out there today, what percentage of it is clean, and what is ready to use? This inventory should be automated rather than manually compiled through spreadsheet loads, and it should be collected in a centralized repository. An automated inventory gives the organization visibility into the underpinnings of its business. At the end of this step, organizations will understand their data management infrastructure and any dark corners, revealing critical “unknown unknowns.” One of the common mistakes we see is when customers skip this inventory step, start with a catalog first, and then try to leverage AI/ML to extract metadata. We call this the cart-before-the-horse approach. Without a robust metadata foundation, these methods will not deliver the desired results, and clients often end up dealing with false positives or inaccurate results.
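To make the inventory step concrete, here is a minimal sketch of automating metadata collection instead of loading spreadsheets by hand. It assumes SQLite sources purely for illustration; a real inventory would scan many source types and land the results in a central metadata repository. The function name and output shape are hypothetical.

```python
import sqlite3

def inventory_sqlite(db_path):
    """Collect table and column metadata from one hypothetical SQLite source."""
    conn = sqlite3.connect(db_path)
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    inventory = {}
    for table in tables:
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        inventory[table] = [{"column": c[1], "type": c[2]} for c in cols]
    conn.close()
    return inventory
```

Running this across every source system, on a schedule, is what turns a one-off spreadsheet exercise into a living “as is” inventory.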
 
That is why rationalizing the inventory should be step two. This step adds business context and definitions to the inventory. The organization can immediately begin to consolidate and categorize the data to reduce redundancy and know where the data starts and ends. Data lineage helps identify owners of the data, showing who creates it, where it changes custody and who uses it. Lineage is a staple that should not be overlooked: it shows where data originates, where it is propagated and used, and whether any business value is associated with it. This step also provides a good foundation for starting data governance programs, as the data will be searchable: categorized, classified and tagged based on usage.
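The lineage idea above can be sketched as a small graph walk. The dataset names and edges here are hypothetical; in practice the edges would come from the metadata repository built in step one.

```python
# Hypothetical lineage edges (source, target): each hop the data takes
EDGES = [
    ("crm.orders", "staging.orders"),
    ("staging.orders", "warehouse.fact_orders"),
    ("warehouse.fact_orders", "reports.sales_dashboard"),
]

def trace_origin(dataset, edges):
    """Walk the lineage backwards until we reach the dataset's origin."""
    upstream = {target: source for source, target in edges}
    while dataset in upstream:
        dataset = upstream[dataset]
    return dataset
```

Tracing `reports.sales_dashboard` back to `crm.orders` in this way answers the "where does this number come from?" question that governance and audit teams ask constantly.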

[Image: Cart-Before-the-Horse-Graphic.jpg]
 
From there, the journey continues:
 
  • Govern: This step connects the information supply chain, establishing subscriptions and alerts for data requests and changes as data flows from origin to use. Organizations will be able to find data according to business context, maintain trust in the data, make it usable and comply with regulations. This step also provides a good foundation for modernization projects such as retiring old systems or moving to the cloud.
  • Share: This step catalogs datasets with collaboration features that enable access for data scientists and business users. Organizations can implement a self-service catalog so data consumers can access the right, trusted data for the task on their own. Data will no longer be controlled by a handful of specialists but will be accessible to more people, with the context they need to understand it. This step is usually a turning point: teams and stakeholders become more involved in the process and drive more data-driven decisions. This is usually where organizations shift to an “offensive” data strategy.
  • Value: This step aligns the business use case portfolio to the data and insights, and it is often where we see our customers leveraging AI/ML models for more predictive analysis. When organizations reach this phase, they have achieved the ultimate goal of finding and delivering the value of data to internal and external users: from reducing time to market to migrating, modernizing and digitalizing with easily accessible, trusted data on hand.
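The self-service catalog described in the Share step can be as simple as tag-based lookup over the rationalized inventory. The catalog entries, tags and owners below are hypothetical examples.

```python
# Hypothetical catalog entries: datasets tagged with business context
CATALOG = {
    "warehouse.fact_orders": {"tags": {"sales", "certified"}, "owner": "finance"},
    "staging.orders": {"tags": {"sales", "raw"}, "owner": "data-eng"},
    "crm.contacts": {"tags": {"marketing", "certified"}, "owner": "crm-team"},
}

def find_datasets(catalog, required_tags):
    """Self-service lookup: return datasets carrying every requested tag."""
    return sorted(name for name, meta in catalog.items()
                  if required_tags <= meta["tags"])
```

A business user searching for `{"sales", "certified"}` finds only the trusted warehouse table, not the raw staging copy, which is exactly the behavior a self-service catalog needs.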
     
In the marketplace today, vendors are approaching clients to start with governance or data catalogs without first knowing the “as is” state of their data environment. This leads to guessing or assumptions about which data aligns with the goals of the business, and typically it is still done in a very manual fashion. Is this how we want to manage our new AI models and algorithms, the secret sauce of the modern organization?
 
 

