The Logical Data Warehouse: Design, Architecture and Technology

Course dates:

  • 12.06.2018 Paasitorni, Paasivuorenkatu 5 A, 950 € + VAT

Early Bird offer – 30%! You get this discount when you book your course place now, by 31.3.2018 at the latest!

Overview

Classic data warehouse architectures are built as a chain of databases: a staging area, a central data warehouse, and several data marts, connected by countless ETL programs that pump data through the chain. This architecture has served many organizations well. But is it still adequate for all the new user requirements, and can new technology for data analysis and storage be used optimally?
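
As a rough illustration of this chain, the following minimal Python sketch copies data from a source through a staging area and a central warehouse into a data mart; the stages, field names, and transformations are invented for the example.

```python
# Rough sketch of the classic chain: every hop physically copies the data onward.

source = [{"id": 1, "amount": "100"}]               # operational source system

def etl_to_staging(rows):
    return [dict(r) for r in rows]                  # copy 1: land raw data in staging

def etl_to_warehouse(rows):
    return [{"id": r["id"], "amount": float(r["amount"])}
            for r in rows]                          # copy 2: cleanse and load centrally

def etl_to_datamart(rows):
    return [r for r in rows if r["amount"] > 50]    # copy 3: subset for one data mart

datamart = etl_to_datamart(etl_to_warehouse(etl_to_staging(source)))
print(datamart)  # the same data now lives in four places
```
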
Integrating self-service BI products with this architecture is not easy, certainly not if users want to access the source systems directly. Delivering 100% up-to-date data to support operational BI is difficult to implement. And how do we embed new storage technologies, such as Hadoop and NoSQL, into the architecture?
It is time to migrate gradually to a more flexible architecture in which new data sources can be hooked up to the data warehouse more quickly, self-service BI is supported properly, operational BI is easy to implement, new technology such as Hadoop and NoSQL is easy to adopt, and processing big data is an evolution rather than a technological revolution.
The architecture that fulfills all these needs is called the logical data warehouse architecture. This architecture, introduced by Gartner, is based on decoupling reporting and analysis on the one hand from the data sources on the other.
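
The decoupling idea can be made concrete with a minimal, purely illustrative Python sketch: two hypothetical source systems are hidden behind one virtual table, so a report queries the virtual table without knowing where the rows physically live.

```python
# Sketch of decoupling: reports see a virtual table, never the sources themselves.

def crm_system():
    # Stand-in for an operational CRM source.
    return [{"id": 1, "name": "Alice", "country": "FI"}]

def legacy_orders_db():
    # Stand-in for a second, differently structured source.
    return [{"cust_id": 2, "cust_name": "Bob", "cust_country": "SE"}]

def virtual_customers():
    """Virtual table: integrates and harmonizes both sources at query time."""
    rows = list(crm_system())
    for r in legacy_orders_db():
        rows.append({"id": r["cust_id"], "name": r["cust_name"],
                     "country": r["cust_country"]})
    return rows

for row in virtual_customers():   # a report only touches the virtual table
    print(row)
```
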
The technology to create a logical data warehouse is available, and many organizations have already successfully completed the migration; a migration based on a step-by-step process, not on a full rip-and-replace approach.
This practical course explains the architecture, discusses the available products, and shows how organizations can migrate their existing architecture to the new one. Tips and design guidelines are given to make this migration as efficient as possible.
All public courses are available as in-house training. Contact us for more information.

Learning Objectives

  • What are the practical benefits of the logical data warehouse architecture, and how does it differ from the classic architecture?
  • How can organizations successfully migrate, step by step, to this flexible logical data warehouse architecture?
  • Understand the possibilities and limitations of the various available products.
  • How do data virtualization products work?
  • Discover how big data can be added transparently to the existing BI environment.
  • Understand how self-service BI can be integrated with the classic forms of BI.
  • Learn how users can be granted access to 100% up-to-date data without disrupting the operational systems.
  • What are the real-life experiences of organizations that have already implemented a logical data warehouse?

Course Outline

Challenges for the Classic Data Warehouse

• Integrating big data with existing data and making it available for reporting and analytics
• Supporting self-service BI and self-service data preparation
• Faster time-to-market for reports
• Polyglot persistence – processing data stored in Hadoop and NoSQL systems
• Operational business intelligence, or analyzing 100% up-to-date data
The Logical Data Warehouse
• The essence: decoupling of reporting and data sources
• From batch-integration to on-demand integration of data
• The impact on flexibility and productivity – an improved time-to-market for reports
• Examples of organizations operating a logical data warehouse
• Can a logical data warehouse really work without a physical data warehouse?
Implementing a Logical Data Warehouse with Data Virtualization Servers
• Why data virtualization?
• Market overview: AtScale, Cirro Data Hub, Cisco Information Server, Data Virtuality, UltraWrap, Denodo Platform, Red Hat JBoss Data Virtualization, Rocket DV, and Stone Bond Enterprise Enabler
• Importing non-relational data, such as XML and JSON documents, web services, NoSQL, and Hadoop data (see the sketch after this list)
• The importance of an integrated business glossary and centralization of metadata specifications
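
As a small, product-neutral illustration of what importing non-relational data involves, the sketch below flattens a nested JSON document into relational-style rows that a virtual table could expose to SQL-based reporting tools.

```python
import json

# Hypothetical JSON document as it might arrive from a web service or NoSQL store.
doc = json.loads("""
{"customer": "Alice",
 "orders": [{"id": 10, "amount": 250.0},
            {"id": 11, "amount": 99.5}]}
""")

# Flatten the nested structure into one flat row per order.
rows = [{"customer": doc["customer"], "order_id": o["id"], "amount": o["amount"]}
        for o in doc["orders"]]

for row in rows:
    print(row)  # {'customer': 'Alice', 'order_id': 10, 'amount': 250.0} ...
```
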
Improving the Query Performance of Data Virtualization Servers
• How does caching really work? (a minimal sketch follows this list)
• Which virtual tables should be cached?
• Query optimization techniques and the explain feature
• Smart drivers/connectors can help improve query performance
• How can SQL-on-Hadoop engines speed up query performance?
• Working with multiple data virtualization servers in a distributed environment to minimize network traffic
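
One common caching pattern is a time-based cache in front of an expensive federated query. Real data virtualization servers use far more sophisticated strategies; the names and the refresh interval in this sketch are purely illustrative.

```python
import time

CACHE_TTL_SECONDS = 60      # illustrative refresh interval
_cache = {}                 # virtual table name -> (timestamp, rows)

def query_source_systems(table):
    # Placeholder for the expensive federated query against the source systems.
    print(f"running federated query for {table} ...")
    return [("row-1",), ("row-2",)]

def query_virtual_table(table):
    """Serve cached rows while fresh; otherwise re-run the federated query."""
    entry = _cache.get(table)
    if entry and time.time() - entry[0] < CACHE_TTL_SECONDS:
        return entry[1]                     # cache hit: the sources are not touched
    rows = query_source_systems(table)      # cache miss: query the sources
    _cache[table] = (time.time(), rows)
    return rows

query_virtual_table("customers")  # first call runs the federated query
query_virtual_table("customers")  # second call is served from the cache
```
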
Migrating to a Logical Data Warehouse
• An A to Z roadmap
• Guidelines for the development of a logical data warehouse
• Three different methods for modelling: outside-in, inside-out, and middle-out
• The value of a canonical data model
• Considerations for security aspects
• Step-by-step dismantling of the existing architecture
• The focus on sharing of metadata specifications for integration, transformation, and cleansing
Self-Service BI and the Logical Data Warehouse
• Why self-service BI can lead to “report chaos”
• Centralizing and reusing metadata specifications with a logical data warehouse
• Upgrading self-service BI into managed self-service BI
• Implementing Gartner’s bimodal environment
Big Data and the Logical Data Warehouse
• New data storage technologies for big data, including Hadoop, MongoDB, Cassandra
• The emergence of the polyglot persistent environment: each application gets its own optimal database technology
• Design rules to integrate big data and the data warehouse seamlessly
• Big data is too “big” to copy
• Offloading cold data with a logical data warehouse, as sketched below
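
The offloading idea boils down to a routing rule: recent "hot" data stays in the relational warehouse, older "cold" data moves to cheap storage such as Hadoop, and a virtual table unions both at query time. The cut-off date and function names below are assumptions for illustration.

```python
from datetime import date

HOT_SINCE = date(2018, 1, 1)   # illustrative cut-off between hot and cold data

def query_warehouse(since):
    # Placeholder for the relational warehouse holding recent data.
    return [{"day": date(2018, 3, 1), "sales": 120}]

def query_hadoop_archive(until):
    # Placeholder for cold data offloaded to cheap Hadoop storage.
    return [{"day": date(2017, 6, 1), "sales": 80}]

def virtual_sales(from_day, to_day):
    """Virtual table that hides the hot/cold split from the report."""
    rows = []
    if from_day < HOT_SINCE:
        rows += query_hadoop_archive(until=HOT_SINCE)   # cold part
    if to_day >= HOT_SINCE:
        rows += query_warehouse(since=HOT_SINCE)        # hot part
    return [r for r in rows if from_day <= r["day"] <= to_day]

print(virtual_sales(date(2017, 1, 1), date(2018, 12, 31)))  # spans both stores
```
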
Physical Data Lakes or Virtual Data Lakes?
• What is a data lake?
• Is developing a physical data lake realistic when working with big data?
• Developing a virtual data lake with data virtualization servers
• Can the logical data warehouse and the virtual data lake be combined?
Implementing Operational BI with a Logical Data Warehouse
• Examples of operational reporting and operational analytics
• Extending a logical data warehouse with operational data for real-time analytics
• “Streaming” data in a logical data warehouse
• The coupling of data replication and data virtualization, as sketched below
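
Coupling replication with virtualization roughly means merging freshly replicated events with historical rows inside one virtual view, as in this simplified stand-in:

```python
# Sketch: one virtual view over historical data and freshly replicated events.

history = [{"order": 1, "status": "shipped"}]      # rows already in the warehouse
stream = [{"order": 2, "status": "new"},           # rows arriving via replication
          {"order": 1, "status": "delivered"}]     # update to an existing order

def operational_view(history, stream):
    """Latest status per order: 100% up-to-date, without disturbing the sources."""
    latest = {row["order"]: row for row in history}
    for event in stream:
        latest[event["order"]] = event             # stream overrides history
    return sorted(latest.values(), key=lambda r: r["order"])

for row in operational_view(history, stream):
    print(row)   # order 1 -> delivered, order 2 -> new
```
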
Making Data Vault more Flexible with a Logical Data Warehouse
• What exactly is Data Vault?
• Using a Logical Data Warehouse to make data in a Data Vault available for reporting and analytics
• The structured SuperNova design technique to develop virtual data marts
• SuperNova turns a Data Vault into a flexible database
The Logical Data Warehouse and Its Environment
• Design principles to define data quality rules in a logical data warehouse
• How data preparation can be integrated with a logical data warehouse
• Shifting of tasks in the BI Competence Center (BICC)
• Which new development and design skills are important?
• The impact on the entire design and development process
Concluding Remarks

Who It’s For

This course is intended for everyone who needs to be aware of developments in the field of business intelligence and data warehousing, such as:
Business Intelligence Specialists
Data Analysts
Data Warehouse Designers
Business Analysts
Data Scientists
Technology Planners
Technical Architects
Enterprise Architects
IT Consultants
IT Strategists
Systems Analysts
Database Developers
Database Administrators
Solutions Architects
Data Architects
IT Managers
