Course Description
This 3-day Informatica big data integration course teaches students how to use Informatica to design and develop data integration mappings that run on Hadoop. The course covers Hadoop ingestion, offloading, and native execution of data integration mappings in Hadoop. It also shows how to optimize data warehouse processing in Hadoop environments.
The labs in this big data course take you from using PowerCenter and the Developer tool to populate Hadoop data stores, to running those mappings in Hadoop. Complex data files are also defined and parsed in Hadoop.
Course Outcomes
- Describe data warehouse optimization in Hadoop environments
- Describe license capabilities included in Informatica Big Data Editions
- Offload data on Hadoop using Informatica PowerExchange for Hadoop
- Offload processing (workloads) into Hadoop using Informatica Developer Tool
- Process file types in Hadoop that cannot be processed in a traditional data warehouse setting, including complex binary files such as web logs and Call Detail Records (CDRs)
- Describe optimum mapping design methods when executing Informatica mappings in Hadoop
- Read and write MongoDB data in both relational and JSON forms
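The last outcome hinges on the difference between a JSON (document) view and a relational (flat column) view of the same record. As a rough illustration of what that mapping involves, here is a minimal sketch in plain Python, with no MongoDB connection; the field names are hypothetical and not taken from any course material:

```python
import json


def flatten_document(doc, prefix=""):
    """Flatten a nested JSON-style document into a flat 'relational' row.

    Nested keys are joined with underscores, e.g. {"a": {"b": 1}} -> {"a_b": 1}.
    """
    row = {}
    for key, value in doc.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            # Recurse into sub-documents, extending the column-name prefix.
            row.update(flatten_document(value, prefix=f"{name}_"))
        else:
            row[name] = value
    return row


# Hypothetical call-detail-record-style document.
doc = {
    "caller": "555-0100",
    "duration_sec": 42,
    "cell": {"tower_id": "T17", "region": "west"},
}

print(json.dumps(doc))          # JSON (document) form
print(flatten_document(doc))    # relational (flat column) form
```

A tool reading MongoDB relationally performs a transformation of this kind, turning nested documents into fixed columns; reading in JSON form preserves the document structure as-is.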
Course Summary
| Next Public Course Dates | |
| Prerequisites | |
| Duration | 3 days |
| Available Formats | |
| Audience | |
Testimonials
“The content was great for an introduction to Tableau. I truly appreciate the way Michael was able to explain everything, and would recommend this class to anyone needing to get their feet wet with Tableau.”
- Shianne DeGeorge, Business Intelligence Group, Kaiser Permanente
“Steve was a great instructor and was able to articulate the course material in a way that promoted learning.”
- Ben Ruddock, Analyst, GSP International Airport
“The Informatica Cloud Data Integration for Developers training class was excellent; the teacher was very knowledgeable and the materials were very useful.”
- James Martin, Data Engineer, RoyOMartin