Our unique 5-day instructor-led Informatica PowerCenter training bootcamp introduces experienced data integration teams to PowerCenter through lectures and hands-on labs. The class is designed for data integration / ETL / ELT developers who are switching to Informatica PowerCenter and have worked with other data integration tools such as Microsoft SSIS, Oracle Data Integrator (ODI), IBM InfoSphere DataStage, SAP BusinessObjects Data Integrator / Data Services, Talend, or Dell Boomi, to name a few.

The training combines the PowerCenter Level 1 and Level 2 Developer classes into a single 5-day bootcamp that enables your team to use the new tool with confidence. Our team delivers this training at your office or via virtual classroom. Course materials and a lab environment are provided. Exist Management LLC (ExistBI) is an Authorized Informatica Training, Consulting and Systems Integration Partner.

Agenda

Module 0: PowerCenter 10 Overview

Module 1: PowerCenter 10 Architecture

  • Describe the components of the Informatica PowerCenter 10 architecture and define key terms
  • Describe PowerCenter’s optional and built-in high availability features

Module 2: Parameter Files

  • Ascertain the use of the IsExprVar property in a mapping
  • Determine the structure of a parameter file (see the sample file after this list)
  • Establish the use of parameter files in mappings and sessions
  • Describe the flexibility of using parameter files to build mapping expression logic
  • Describe the use of a date/time mapping variable in a parameter file for incremental loading
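
For reference, a minimal parameter file sketch is shown below. The folder, workflow, session, connection, and variable names are illustrative placeholders, not part of the official courseware.

    [DEV_SALES.WF:wf_daily_orders.ST:s_m_stg_orders]
    $DBConnection_Source=ORA_SALES_DEV
    $InputFile_Orders=/data/incoming/orders.csv
    $$LastExtractTime=01/01/2024 00:00:00
    $$IncrementalFilter=UPDATE_TS > TO_DATE('01/01/2024 00:00:00','MM/DD/YYYY HH24:MI:SS')

The [folder.WF:workflow.ST:session] heading scopes the values to one session; the $DBConnection_ and $InputFile_ prefixes follow the session-parameter naming rules; $$LastExtractTime is the kind of date/time mapping variable used for incremental loading; and $$IncrementalFilter only behaves as expression logic when the corresponding mapping variable has IsExprVar set to TRUE.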

Module 3: User-Defined and Advanced Functions

  • Describe and implement advanced functions
  • Describe user-defined functions (UDFs)
  • Create a public user-defined function for standard name formatting and implement the UDF in a mapping
  • Use the AES_Encrypt and Encode functions to encrypt and encode customer data before writing it to a flat file (see the expression sketch after this list)
  • Debug the mapping using an existing session and observe the results
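
The lab expressions are roughly of the following shape; udf_StandardName, CARD_NUMBER, and the key literal are hypothetical names used only for illustration.

    -- call a public user-defined function with the :UDF prefix
    :UDF.udf_StandardName(FIRST_NAME, LAST_NAME)

    -- encrypt, then Base64-encode, customer data before writing it to the flat file
    ENC_BASE64(AES_ENCRYPT(CARD_NUMBER, 'Sample_Key_123'))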

Module 4: Pivoting Data

  • Describe the use of a Normalizer transformation to normalize data
  • Describe the use of an Aggregator to denormalize data
  • Normalize data into a relational table
  • Denormalize data into a Fact table

Module 5: Dynamic Lookups

  • Define Dynamic Lookup
  • Describe the Dynamic Lookup Cache
  • Use a Dynamic Lookup to load data into a dimension table
  • Use a Dynamic Lookup in tandem with an Update Strategy transformation to keep historic data in a dimension table (see the expression sketch after this list)
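
One common pattern covered here: the dynamic lookup cache exposes a NewLookupRow port (0 = no change, 1 = row inserted into the cache, 2 = row updated in the cache), and a downstream Update Strategy can flag rows accordingly. The expression below is a typical sketch, not the only valid design.

    IIF(NewLookupRow = 1, DD_INSERT,
    IIF(NewLookupRow = 2, DD_UPDATE, DD_REJECT))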

Module 6: Stored Procedure and SQL Transformations

  • Call a SQL stored procedure from a PowerCenter mapping
  • Create and configure a SQL transformation in script mode
  • Create and configure a SQL transformation in query mode
  • Use a SQL transformation to create tables on an “as needed” basis
  • Enter a properly formatted query into a SQL transformation (see the sample query after this list)
  • Locate database errors in the result output of a SQL transformation
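
In query mode, a query typically binds input port values with ?port? and can substitute text with ~port~; the table and port names below are illustrative only.

    SELECT cust_id, cust_name, region_cd
    FROM ~Stage_Table~
    WHERE region_cd = ?Region_Code?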

Module 7: Troubleshooting Methodology and Error Handling

  • Design error handling strategies appropriate for the intended purpose of a workflow
  • Identify data errors and load them to an error table
  • Describe Update Strategies

Module 8: Transaction Processing

  • Describe PowerCenter source-based, target-based, and user-based transaction control with and without the high availability option
  • Describe constraint-based loading in databases with referential integrity constraints
  • Load data to a set of tables with an RDBMS primary key / foreign key relationship

Module 9: Transaction Control Transformation

  • Describe the use of the transaction control transformation for data-driven transaction control
  • Control when data is committed to disk or the target database
  • Use a transformation variable to create a flag that determines when to commit data to the RDBMS based upon data values (see the expression sketch after this list)
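
As a sketch of the data-driven pattern, an upstream Expression transformation sets a flag port (COMMIT_FLAG is a made-up name), and the Transaction Control transformation evaluates it with the built-in constants:

    IIF(COMMIT_FLAG = 1, TC_COMMIT_BEFORE, TC_CONTINUE_TRANSACTION)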

Module 10: Recovery

  • Describe workflow and task recovery with and without the high availability option
  • Recover tasks and workflows that stop, abort, or terminate
  • Verify that workflow recovery works in a consistent, reliable manner

Module 11: Command Line Programs

  • Describe pmcmd, pmrep, and infacmd command line functionality
  • Build batch files that use the pmcmd and pmrep command line programs (see the batch sketch after this list)
  • Use the command line utilities to execute a variety of platform status, query, object export, and workflow tasks
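
A minimal Windows batch sketch of the kind built in this module. The domain, service, repository, folder, workflow, and credential values are placeholders, and real scripts would use the encrypted or environment-variable password options rather than plain text.

    @echo off
    rem check the Integration Service, then run a workflow and wait for it to finish
    pmcmd pingservice -sv IS_DEV -d Domain_DEV
    pmcmd startworkflow -sv IS_DEV -d Domain_DEV -u Administrator -p Password1 -f DEV_SALES -wait wf_daily_orders

    rem connect to the repository and export a mapping to XML
    pmrep connect -r REP_DEV -d Domain_DEV -n Administrator -x Password1
    pmrep objectexport -n m_stg_orders -o mapping -f DEV_SALES -u m_stg_orders.xml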

Module 12: Performance Tuning Methodology

  • Isolate source, target and engine bottlenecks
  • Interpret the performance counters
  • Tune different types of bottlenecks
  • Run a benchmark test
  • Run a target bottleneck test
  • Evaluate the results

Module 13: Performance Tuning Mapping Design

  • Apply best practices in your mappings to optimize performance
  • Locate session properties that can unnecessarily lower performance
  • Inspect and edit mappings for optimal performance design
  • Inspect and edit transformations for optimal performance design

Module 14: Memory Optimization

  • Tune session-level memory
  • Tune transformation caches
  • Calculate how much memory a session uses
  • Become familiar with PowerCenter Performance Counters
  • Edit session memory limits
  • Edit transformation cache memory properties
  • Calculate memory cache sizes for transformations (a worked estimate follows this list)
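
By way of a rough, illustrative estimate (the row counts and byte sizes below are assumptions, not courseware figures), a lookup cache might be sized along these lines before being verified against the session log and performance counters:

    rows cached from the lookup source          1,000,000
    data cache  ~ 1,000,000 rows x 250 bytes    ~ 250 MB
    index cache ~ 1,000,000 rows x  16 bytes    ~  16 MB
    estimated lookup cache total                ~ 266 MB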

Module 15: Performance Tuning: Pipeline Partitioning

  • Apply partition points to efficiently utilize your CPU
  • Partition your data to efficiently utilize your CPU
  • Distribute your partitioned data to preserve functionality while optimizing your CPU
  • Optimize your memory usage according to your partitioning strategy