ASG Perspectives


ASG EVOLVE20: Takeaways for Data Leaders

Wow – ASG’s Virtual EVOLVE20! What a great week spent with phenomenal data leaders across many impressive organizations and partners who are truly making a difference with data! The talk track is no longer “what is metadata, what is data lineage, and what does it have to do with my data?” It has evolved to data intelligence, data piping, data democratization, data aware…

This blog highlights session snippets, including links to the recordings of the most popular must-see sessions from our EVOLVE20 partners and clients.

1) Information Builders (iBi): Using “Business Ready” Trusted Data for Better Decision Making

  • Dennis McLaughlin - VP, Operations and Product Management at iBi
  • Vincent Deeney - World Wide Strategic Services at iBi

Dennis and Vincent showed how the combined iBi/ASG DI solution works not only to prepare “business ready” data but to build trust in that data for the organization as a whole. High-quality data is what makes proficient integration possible. They noted that automating the data management life cycle is now mainstream for clients that view data as a critical asset to the organization.

The iBi solution profiles the organization’s most critical data to tease out common data issues: whether the data is structured correctly in the application and whether it meets the organization’s integrity thresholds for its most critical data. iBi then invokes data quality rules to enforce that integrity.
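To make that idea concrete, here is a minimal sketch of the kind of checks an automated profiler applies. This is not iBi’s actual API – the function names, threshold, and pattern below are invented purely for illustration.

```python
# Illustrative data quality rules of the kind a profiler invokes.
# Hypothetical names and thresholds -- not iBi's actual product API.
import re

def check_completeness(values, threshold=0.98):
    """Flag a field whose non-null ratio falls below the integrity threshold."""
    non_null = sum(1 for v in values if v not in (None, ""))
    ratio = non_null / len(values)
    return ratio >= threshold, ratio

def check_format(values, pattern=r"^\d{5}(-\d{4})?$"):
    """Check structural correctness -- here, US ZIP codes as an example."""
    regex = re.compile(pattern)
    violations = [v for v in values if not regex.match(str(v))]
    return not violations, violations

zip_codes = ["02116", "19104-2688", "ABC12", None]
ok, ratio = check_completeness(zip_codes)
print(f"completeness ok={ok}, ratio={ratio:.2f}")   # ok=False, ratio=0.75
ok, bad = check_format([z for z in zip_codes if z])
print(f"format ok={ok}, violations={bad}")          # ok=False, violations=['ABC12']
```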

The combined iBi/ASG Data Intelligence (ASG DI) solution augments the data quality results by tagging, curating, and approving the critical data in accordance with the data quality rules. The lineage map pinpoints exactly where good and bad data resides and how it propagates across the organization, as well as externally to the company.
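Here is a hypothetical sketch of how a lineage map pinpoints that propagation: given an asset that failed a quality rule, walk the graph to find every downstream consumer. The graph and asset names are invented for illustration.

```python
# Tracing bad data through an (invented) lineage graph: source -> consumers.
from collections import deque

lineage = {
    "crm.customers":          ["warehouse.dim_customer"],
    "warehouse.dim_customer": ["reports.churn", "exports.partner_feed"],
    "erp.orders":             ["warehouse.fact_orders"],
}

def downstream_impact(failed_asset):
    """Breadth-first walk from an asset that failed a data quality rule."""
    impacted, queue = set(), deque([failed_asset])
    while queue:
        node = queue.popleft()
        for child in lineage.get(node, []):
            if child not in impacted:
                impacted.add(child)
                queue.append(child)
    return impacted

print(downstream_impact("crm.customers"))
# {'warehouse.dim_customer', 'reports.churn', 'exports.partner_feed'}
```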

One interesting transition iBi is currently making is from hand-written rules to inferred rules. iBi is finding that tools can do more of the work for you based on accumulated industry experience: most rules are no longer written from scratch, and many can be pre-loaded, with AI techniques applied to learn and improve the data stewardship process in general.

Master Data Management adds data value

From a Master Data Management (MDM) perspective, your highest-value data (person, patient, plant, franchise, product) is protected and trusted with the inclusion of ASG DI data lineage. Mastering the data is also increasingly automated through the accumulation of industry patterns and experience. Dennis likened the new MDM experience to today’s train sets that arrive with out-of-the-box modules, versus spending hours and hours on the intricacies of the tracks and power to make them run. This new MDM experience packages the processes already learned within the industry and potentially decreases a typical MDM time-to-joy from seven months down to perhaps three.

Finally, the capability to visualize data from various business areas by context promotes a metadata-driven approach. This helps you prepare and profile the data as you instantiate a new environment or get a legacy system shipshape. In the end, it provides you with access to the highest-quality, best-protected data…that you can rely on!

2) Citizens Bank – Establishing the Citizens Data Management Framework

  • Vinay Jha – CDO at Citizens
  • James Varkey – SVP, Data Management at Citizens

Vinay kicked off the session with his strategic goal of producing an “output-driven data strategy.” Citizens Bank’s approach is to begin with a business problem and then shape a data strategy that supports the business initiative. This alignment with business strategy helps them prioritize their immediate and longer-term target data strategies. The end goal is data democratization: building trust that business users are always utilizing the correct data from the right source, and that it meets the data integrity thresholds defined by the bank.

Metadata Dividends! Robust metadata is foundational to moving towards data democratization. Citizens Bank scans application metadata on a regular basis: 11 million fields are kept up to date and connected to the glossary. Basic questions like “where did the data come from?” and “how is the data being used?” are answered with data lineage. The dividends of this process matter for moving to the cloud as well as for regulatory requirements.

  • GLBA Safeguards ensure the protection of sensitive data. Citizens Bank uses ASG DI to classify sensitive data to answer such questions as: Is this privacy information? What classification does it have? Is it internal or external data? Is it confidential? (A sketch of this style of classification follows this list.) At any given time, GLBA regulators can obtain a report of where all the sensitive data is stored and used. LIBOR and CCPA are other regulatory initiatives where ASG Data Intelligence is used.
  • Automated Data Transfer Registration ensures the bank has a handle on where data is being transferred, what data is being transferred, and how this information is traced via ASG DI lineage. Within the lineage, the security and controls on these data transfers are validated during the registration process.
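As a rough illustration of the classification step referenced above, the sketch below tags sample column values with privacy labels using pattern rules. The labels and patterns are invented; a real deployment would rely on ASG DI’s classifiers rather than a handful of regexes.

```python
# Hypothetical rule-based sensitive-data classification feeding a GLBA-style
# report. Labels and patterns are illustrative, not ASG DI's actual rules.
import re

CLASSIFIERS = [
    ("SSN",         re.compile(r"^\d{3}-\d{2}-\d{4}$")),
    ("CREDIT_CARD", re.compile(r"^\d{4}([ -]?\d{4}){3}$")),
    ("EMAIL",       re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")),
]

def classify(sample_values):
    """Return the privacy classifications matched by a column's sample values."""
    tags = {label
            for value in sample_values
            for label, pattern in CLASSIFIERS
            if pattern.match(str(value))}
    return tags or {"UNCLASSIFIED"}

print(classify(["123-45-6789"]))        # {'SSN'} -> confidential, internal-only
print(classify(["widget-42", "acme"]))  # {'UNCLASSIFIED'}
```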

3) Regulatory Roundtable with BNY Mellon, Accenture and ASG Technologies:

Accelerating your LIBOR transition and other hot-button topics

  • William Knapik – Principal Director at Accenture
  • Michael King – Director of Enterprise Governance at BNY Mellon
  • Sue Laine – ASG moderator

I want to emphasize “other hot-button topics” here. This is the second time we have teamed up with Accenture’s compliance expert, Will Knapik, and Mike King, Director of Enterprise Governance at BNY Mellon, to discuss how clients are tackling the retirement of the LIBOR reference rate and re-piping their rates by December 2021 – coined by analysts as a $350 trillion problem, or the next Y2K for finance. In discussing BNY’s approach and its ancillary benefits, the conversation became quite lively, making this an extremely popular session that we were asked to repeat for EVOLVE20.

Will kicked off the session with background on why the LIBOR rates are changing and their impact across many different financial products. He discussed the various approaches to re-piping LIBOR rates and where he has seen challenges and pitfalls in the general financial community.

Mike shared his insights into how lineage is key to defining the provenance of each element – quite challenging since LIBOR began 30+ years ago – as well as the complexity of not only the rates themselves but the diverse landscape that LIBOR rates travel through.

In 11 weeks, BNY traversed 150 applications and 20 million lines of code with the DI technology. This initial phase of the project produced the map of where LIBOR codes exist and how they flow across their diverse data ecosystem. The findings, or “map,” are sent to the business units for the re-piping of the rates. From this initial success, the BNY team has built an assembly-line factory that is profiling 100 applications a month. BNY is confident that, through impact analysis and validation for compliance, all rates will be removed and the risk avoided by the deadline next year.
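For intuition, here is a toy version of that mapping step: walk a repository and record every line that references LIBOR. A real DI scanner parses the code rather than pattern-matching text; the paths and file types below are assumptions.

```python
# Toy LIBOR discovery scan -- a stand-in for real metadata scanning, which
# parses code instead of grepping it. Paths and extensions are invented.
import pathlib
import re

LIBOR_PATTERN = re.compile(r"\bUSD[-_ ]?LIBOR\b|\bLIBOR\b", re.IGNORECASE)
SOURCE_TYPES = {".py", ".sql", ".cbl", ".java"}

def scan_repository(root):
    """Yield (file, line_number, line) for every LIBOR reference found."""
    for path in pathlib.Path(root).rglob("*"):
        if path.is_file() and path.suffix in SOURCE_TYPES:
            for n, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
                if LIBOR_PATTERN.search(line):
                    yield str(path), n, line.strip()

# Each hit feeds the "map" handed to the business units for re-piping.
for hit in scan_repository("./apps"):
    print(hit)
```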

Will mentioned that “tools help you monitor the tool and test the impact. They test what the change impacts downstream and help you diagnose what the problem is if it propagates an issue.” Both concurred that this cannot be achieved manually.

Dividends – Mike King

  • UDTs – BNY identified technical debt where, in some cases, six or seven applications were doing the same thing. They were able to decommission and remove some of this technical debt during the LIBOR effort.
  • Root cause analysis – By visualizing data from the point of production to the point of consumption, BNY was able to identify and remediate long-standing data issues quickly.
  • Delta Scans – The BNY Data Intelligence team uses ASG DI to scan deltas as changes are made to applications. This regular change review catches data issues before they hit production (see the sketch after this list).
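A delta scan boils down to diffing two metadata snapshots of the same application. The sketch below shows the idea with an invented snapshot format; ASG DI’s actual scan output is far richer than this.

```python
# Diffing two (invented) field-level metadata snapshots of one application.
old = {"orders.amount": "DECIMAL(10,2)", "orders.rate": "CHAR(8)"}
new = {"orders.amount": "DECIMAL(12,2)", "orders.sofr_rate": "CHAR(8)"}

def delta(old, new):
    """Classify field-level changes between two scans for change review."""
    added   = {k: new[k] for k in new.keys() - old.keys()}
    removed = {k: old[k] for k in old.keys() - new.keys()}
    changed = {k: (old[k], new[k]) for k in old.keys() & new.keys()
               if old[k] != new[k]}
    return added, removed, changed

added, removed, changed = delta(old, new)
print("added:",   added)    # {'orders.sofr_rate': 'CHAR(8)'}
print("removed:", removed)  # {'orders.rate': 'CHAR(8)'}
print("changed:", changed)  # {'orders.amount': ('DECIMAL(10,2)', 'DECIMAL(12,2)')}
```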

Unpacked Discovery – BNY learned exactly what was happening to data integrity at the element level across the entire stream, from the system of origination out to the reports. Many data discoveries were made tracing critical data elements to and from the reports.

Rationalization of Regulations – Application forensic reviews continue beyond the LIBOR re-piping efforts. These regular reviews show that, viewed at a higher level, more than one regulatory effort intersects with the same data. GDPR, MiFID II, and market requirements for the Fed are other regulations that can be rationalized and sustained with the Data Intelligence solution.

Advice
  • Start small and scale big
  • Put yourself in a more defensible position by using automation to respond quickly
  • Delta snapshots pay huge dividends in troubleshooting impact to the regulatory reports in general. Folks really start to see the advantages of the tool at this point.

There were other excellent sessions at EVOLVE20 as well. I want to thank Mike McNea from Micron, who educated other clients on how to scale scanning and discovery at an enterprise level, and, of course, our French sponsor, JEMS, who highlighted using Data Intelligence to drive regulatory resolve in Europe. And finally, yours truly delivered the keynote session, which highlighted the journey from compliance to disruption and how we are achieving this transition with our partners and ASG.

I encourage you to watch these sessions and more on-demand on the EVOLVE20 Virtual Event platform!

Posted: 10/22/2020 8:00:00 AM by Susan Laine
Filed under: Accenture, BNY_Mellon, Citizens, Data_Intelligence, EVOLVE, EVOLVE20, GDPR, iBi, JEMS, LIBOR