Enterprise Data Governance & Master Data Management

Course review
4.5/5
11 Reviewers

Introduction

Creating Trusted, Compliant Data for the Data-Driven Enterprise


OVERVIEW

Having understood the requirements, you will learn what should be in a governance programme. This includes a data governance framework covering what you need to govern data: roles and responsibilities, processes, policies and technologies. It also includes a master data management strategy and what you need to do to bring your master data under control. We will look at how to manage and leverage a business glossary, data modelling, a data catalogue with automated data discovery, data profiling, sensitive data classification and policy enforcement. We look at data cleansing and data integration to provision master data and reference Data-as-a-Service (DaaS). We also look at how customer master data is being combined with data warehouses and big data to create new Customer Data Platforms (CDPs).

During the seminar we take an in-depth look at the technologies needed in each of these areas as well as best practice methodologies and processes for data governance and master data management.

Many businesses today are operating in a distributed computing environment with data and processes running across the data centre, multiple clouds and the edge. In this environment, with so much going on, data is harder to find and govern. Master data, the most widely used data in any business, is also becoming harder to find, manage and keep synchronised across systems. This two-day in-depth seminar looks at this problem and shows how to successfully implement a data governance program, including data quality, data access security, data privacy, data retention and master data management, to create a 360-degree view of customers, products, suppliers and other core entities. It is intended for chief data officers, enterprise architects, data architects, MDM professionals, business professionals, database administrators, data engineers, and compliance managers responsible for data governance and management of specific master data.

The seminar takes a detailed look at the business problems caused by poorly governed data and how inconsistent identifiers and data names, poor data quality, and lack of master data integration and synchronisation can seriously impact business operations, cause unplanned operational costs and destroy trust in BI and analytics. It also defines the requirements that need to be met for a company to confidently define, manage and govern data as well as create and share consistent reference and master data across operational applications and analytical systems on-premises and in the cloud.

AUDIENCE

This seminar is intended for business and IT professionals responsible for enterprise data governance including metadata management, data integration, data quality, data protection, master data management and enterprise content management. It assumes that you understand basic data management principles and have a high-level understanding of concepts such as data privacy, metadata, data warehousing, data modelling and data cleansing.

LEARNING OBJECTIVES

Attendees will learn how to set up an enterprise data governance program and to determine what technologies they need for enterprise data governance and master data management (MDM). In addition, they will learn how to use key technologies like data catalogues, data classifiers and data fabric to discover and identify data and build an MDM system.


MODULE 1: WHY IS GOVERNANCE OF CORE DATA SO IMPORTANT?

This session looks at the increasingly complex distributed data landscape, the problems it brings and why companies need to invest in provisioning trusted, commonly understood, high quality data services across the enterprise to guarantee consistency. It also looks at why data governance, data integration and data management should now be a core competency for any organisation.

  • The ever-increasing distributed data landscape
  • The impact of unmanaged data on business profitability and ability to respond to competitive pressure
  • Is your data out of control?
  • Key requirements for Enterprise Data Governance (EDG)
    o Governing the capture, protection, use, maintenance and decommissioning of data
    o The need to monitor, assess and act to uphold policies
  • What is Master Data Management?
  • Reference Data vs. Master Data
  • Establishing a framework for governing your core data
  • Getting the organisation and operating model right
  • Key roles and responsibilities – data owners and data stewards
  • Core processes needed to establish and govern commonly understood data
  • Types of policies and rules needed to govern:
    o Data ingestion
    o Data integrity
    o Data validation
    o Data cleansing
    o Data maintenance
    o Data privacy
    o Data access security
    o Data lifecycle management
    o Data loss prevention
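To make the policy types above concrete, here is a minimal sketch of how data lifecycle and data validation policies can be expressed as machine-checkable rules. The rule names, retention periods and field checks are illustrative assumptions, not part of the seminar material:

```python
from datetime import date

# Illustrative retention periods per data class (hypothetical values)
RETENTION_DAYS = {"invoice": 7 * 365, "web_log": 90}

def violates_retention(record_class: str, created: date, today: date) -> bool:
    """Data lifecycle rule: flag records held beyond their retention period."""
    limit = RETENTION_DAYS.get(record_class)
    return limit is not None and (today - created).days > limit

def validate_customer(record: dict) -> list[str]:
    """Data validation rule: required fields and basic integrity checks."""
    errors = []
    if not record.get("customer_id"):
        errors.append("missing customer_id")
    if "@" not in record.get("email", ""):
        errors.append("invalid email")
    return errors

# Usage
print(violates_retention("web_log", date(2023, 1, 1), date(2024, 1, 1)))  # True
print(validate_customer({"customer_id": "C1", "email": "a@b.com"}))       # []
```

Expressing policies as code like this is what makes the "monitor, assess and act" requirement above automatable: the same rules can run at ingestion time and in scheduled compliance scans.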


MODULE 2: A METHODOLOGY & TECHNOLOGIES TO GET DATA UNDER CONTROL

Having understood why trusted data is so critical, this session looks at a data governance framework and methodology for getting your core data under control, along with the technologies needed to help govern it. It also looks at how data catalog software, trainable classifiers, data loss prevention and data fabric software provide the foundation in a modern data architecture to govern data and produce trusted, business-ready master data for use across the enterprise.

• Data governance and data management implementation options
o Centralised, distributed or federated

• The impact of ungoverned self-service data preparation – the need for collaboration across business units

• Data management on-premises and in the cloud

• Creating data handling and data retention classification schemes

• Manual labelling of documents, email and content in Office applications

• A best practice methodology for producing trusted data

• Data Catalogs – the new platform for discovering, profiling, classifying, cataloguing, mapping, and governing data

• The role of Data Fabric in cleaning, integrating, masking and provisioning trusted data for consumption

• The Enterprise Data Marketplace

MODULE 3: DATA STANDARDISATION & THE BUSINESS GLOSSARY
This session looks at the first step in getting data under control – the need for data standardisation. The key to making this happen is to create common data names and definitions for your data to establish a common business vocabulary in the business glossary of a data catalog.

• Data standardisation using a shared business vocabulary

  • The role of a common vocabulary in data governance, Master Data Management, Reference Data Management, SOA, DW and data virtualisation
  • Approaches to creating a common vocabulary
  • Business glossary – now a capability of data catalog software
    o Alation, ASG, Amazon Glue, Collibra, Global IDs, Informatica Axon & Enterprise Data Catalog, IBM Watson Knowledge Catalog, Microsoft Azure Purview, Talend Business Glossary and Data Catalog, SAS Business Data Network
  • Planning for a business glossary
  • Organising data definitions in a business glossary
  • Glossary roles and responsibilities
  • Glossary term ratings, approval and dispute resolution processes
  • Utilising a common vocabulary in Data Modelling, ETL, BI, ESB, APIs, & MDM

 

MODULE 4: AUTO DATA DISCOVERY, DATA QUALITY PROFILING, CLEANSING & INTEGRATION
Having defined your data, this session looks at the next step in the methodology: discovering where your data is and how to bring it under control.

  • Automated data discovery, profiling, classification and sensitive data detection using a Data Catalog
  • Mapping data assets to a business glossary
  • Setting policies to govern data
  • Monitoring data quality and policies across a distributed data landscape
  • A unified approach to producing trusted data
  • Using a data lake as a staging area for data cleansing and integration
  • Building data cleansing and integration pipelines to produce trusted data products using Data Fabric
  • Data provisioning – provisioning trusted, safe data in a data marketplace for use in MDM, analytical systems, and transaction processing
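The kind of column profiling and sensitive-data detection that a data catalog automates can be sketched in a few lines. The metrics and the email pattern below are illustrative assumptions, not any product's actual classifier:

```python
import re

# Simplified pattern for detecting email-like (potentially sensitive) values
EMAIL = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")

def profile_column(values):
    """Return basic quality metrics for one column: completeness,
    distinct count, and whether it looks like it holds email addresses."""
    non_null = [v for v in values if v not in (None, "", "N/A")]
    return {
        "completeness": len(non_null) / len(values) if values else 0.0,
        "distinct": len(set(non_null)),
        "looks_like_email": bool(non_null) and all(EMAIL.fullmatch(v) for v in non_null),
    }

col = ["a@x.com", "b@y.org", "", "c@z.net"]
print(profile_column(col))
# {'completeness': 0.75, 'distinct': 3, 'looks_like_email': True}
```

Real catalog tools run checks like these at scale across every discovered data store, then use the results to auto-classify columns and flag sensitive data for policy enforcement.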

 

MODULE 5: MASTER DATA MANAGEMENT DESIGN AND IMPLEMENTATION
This session looks at the components of a master data management (MDM) and reference data management (RDM) system and the styles of implementation.

• What does MDM 360 mean for master data entities, e.g. Customer 360, Supplier 360, Product 360…

• Components of an MDM solution
• MDM implementation styles and options
o Real-time master data synchronisation
o Virtual MDM (Index / Registry)
o Single Entity Hub vs. Enterprise Multi-Domain MDM

• Identifying candidate master data entities
• Defining a common vocabulary for master data entities
• Master data modelling
• Master Data Hierarchy Management
• Master Data discovery – identifying where your disparate master data is located using a data catalog
• Mapping your disparate master data to your business glossary
• Profiling disparate master data to understand data quality
• Creating trusted master data entities using data cleaning, matching and data integration
• Master data matching – survivorship rules
• Implementing outbound master data synchronisation
• Standardising business processes to create master data
• Governing maintenance of master data
• The MDM solution marketplace
o Ataccama, IBM, Informatica, Magnitude, Oracle, Profisee, Reltio, Riversand, SAP, SAS, Semarchy, Stibo, Talend, TIBCO and more

• Evaluating MDM products
• Integration of MDM solutions with data fabric platforms
• Implementing MDM matching at scale, e.g. IBM MDM Big Match

  • NoSQL Graph DBMSs and MDM
  • MDM in the Cloud – what’s the advantage?
  • Sharing access to master data via master data services in a Service Oriented Architecture (SOA)
  • Leveraging SOA for data synchronisation
  • Integrating MDM with operational applications and process workflows
  • Using master data to tag unstructured content
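A toy sketch of the match-and-survive logic at the heart of creating trusted master data entities: fuzzy matching of candidate duplicates, then survivorship rules (most recently updated, most complete) to build a golden record. The threshold and rules are illustrative assumptions, not any vendor's implementation:

```python
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.85) -> bool:
    """Fuzzy match two names; threshold is an illustrative choice."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def golden_record(duplicates: list[dict]) -> dict:
    """Survivorship: per attribute, take the value from the most recently
    updated record that actually has that attribute populated."""
    ordered = sorted(duplicates, key=lambda r: r["updated"], reverse=True)
    merged = {}
    for rec in ordered:
        for key, value in rec.items():
            if key != "updated" and key not in merged and value:
                merged[key] = value
    return merged

# Hypothetical duplicate records from two source systems
crm = {"name": "Jon Smith", "phone": "", "email": "jon@x.com", "updated": "2024-05-01"}
erp = {"name": "John Smith", "phone": "555-0100", "email": "", "updated": "2023-11-12"}

if similar(crm["name"], erp["name"]):
    print(golden_record([crm, erp]))
# {'name': 'Jon Smith', 'email': 'jon@x.com', 'phone': '555-0100'}
```

Production MDM matching uses far richer techniques (phonetic keys, probabilistic scoring, machine learning), but the shape of the problem, match then apply survivorship rules, is the same.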

MODULE 6: TRANSITIONING TO ENTERPRISE MDM – THE CHANGE MANAGEMENT PROCESS
This session looks at the most difficult job of all – the change management process needed to get to enterprise master data management. It looks at the difficulties involved, what really needs to happen and the process of making it happen.

  • Starting an MDM change management program
  • Changing data entry system data stores
  • Changing application logic to use shared MDM services
  • Changing user interfaces
  • Leveraging portal technology for user interface re-design
  • Leveraging a service-oriented architecture to access MDM shared services
  • Changing ETL jobs to leverage master data
  • Hierarchy change management in MDM and BI systems
  • Transitioning from multiple data entry systems to one data entry system
  • Transitioning change to existing business processes to take advantage of MDM
  • Planning for incremental change management

MODULE 7: FROM MDM TO CUSTOMER DATA PLATFORMS
This last session looks at the emergence of Customer Data Platforms (CDPs) that combine customer MDM, big data and data warehouses to create a platform supporting Marketing, Sales and Customer Service in the digital enterprise.

• What is a Customer Data Platform?
• Customer MDM vs. a CDP
• Components of a CDP
• The CDP Marketplace and what to look for
• Integrating CDPs with digital and traditional marketing, sales and service applications
• Creating a CDP in your enterprise


Reviews

The best parts of the course were:

  • A lot of up-to-date stuff, good experience stories, lots of good charts. Really clear English. – great Mike! Integration to big data very interesting.
  • The best part of the training was the more detailed issues relating to the construction of MDM systems at the end of the training.
  • Review and access to newer perspectives. New and old acquaintances on the course.
  • Skilled clear trainer.
  • Consistent progress on the subject.
  • The benefits and risks of MDM were clearly highlighted.
  • I got a good overview of the steps needed to implement a successful MDM system.
  • He was able to concretize the complex issue – a top expert!

Educator:

MIKE FERGUSON

Managing Director, Intelligent Business Strategies Limited

Mike Ferguson is Managing Director of Intelligent Business Strategies Limited. As an independent IT industry analyst and consultant, he specialises in BI/analytics and data management. With over 40 years of IT experience, Mike has consulted for dozens of companies on BI/analytics, data strategy, technology selection, data architecture, and data management. Mike is also conference chairman of Big Data LDN, the fastest growing data and analytics conference in Europe. He has spoken at events all over the world and written numerous articles. Formerly, he was a principal and co-founder of Codd and Date Europe Limited, the consultancy associated with the inventors of the Relational Model, and a Chief Architect at Teradata, working on the Teradata DBMS. He teaches popular master classes in Data Warehouse Modernisation, Big Data Architecture & Technology, Centralised Data Governance of a Distributed Data Landscape, Practical Guidelines for Implementing a Data Mesh (Data Catalog, Data Fabric, Data Products, Data Marketplace), Real-Time Analytics, Embedded Analytics, Intelligent Apps & AI Automation, Migrating your Data Warehouse to the Cloud, Modern Data Architecture and Data Virtualisation & the Logical Data Warehouse.


Theme:
Data Quality
Educator:
MIKE FERGUSON
Language:
English
Duration:
2 days
Location:
Remote training
Dates:
Contact

This training programme / course currently has no active start dates; if you are interested in the course, please get in touch.

Contact

Please contact:

 


 

More than one participant from the same company?

We also organize company-specific courses.

Course for company
