Course Description
This 3-day Informatica big data integration training teaches students how to use Informatica to design and develop data integration mappings that run on Hadoop. The course covers Hadoop ingestion, offloading, and native execution of data integration mappings in Hadoop. It also shows how to optimize data warehouse processing in Hadoop environments.
The labs take you from using PowerCenter and the Developer tool to populate Hadoop data stores, to running those mappings natively in Hadoop. You will also define and parse complex data files in Hadoop.
Course Outcomes
- Describe data warehouse optimization in Hadoop environments
- Describe the licensing capabilities included in Informatica Big Data Edition
- Offload data to Hadoop using Informatica PowerExchange for Hadoop
- Offload processing (workloads) to Hadoop using the Informatica Developer tool
- Process file types in Hadoop that cannot be handled in a traditional data warehouse setting, including complex binary files such as web logs and call detail records
- Describe optimum mapping design methods when executing Informatica mappings in Hadoop
- Read from and write to MongoDB in both relational and JSON forms
Course Summary
- Next Public Course Dates:
- Prerequisites:
- Duration: 3 days
- Available Formats:
- Audience:
Testimonials
Hands-down the best technical training I have taken! The method of progressively building reports, adding complexity and features as time goes on, was fantastic.
Cirrus Aircraft
Absolutely loved the enthusiasm and appreciate the knowledge he brought to class!!!
KPERS
The course was comprehensive and interactive, and the trainer really understood the content. I have been able to implement much of what I learned into my daily work activities and have saved a lot of time.
Advocate Health Care