- Trainer Information
Course: DataStage PX (9.1 and above)
- Trainer has 9 years of IT & DataStage experience
- Experienced in the Finance, Healthcare, and Life Sciences domains
- 3+ years of experience as a DataStage trainer
- Over 100 batches trained
- We will provide DataStage VM software for hands-on practice.
- No installation is required to start the hands-on work; the necessary setup is already done.
- Regular assignments in the classes.
- Real-time scenarios discussed in class.
- Extra scenarios provided for practice.
Services and Study Materials
- Daily recorded videos will be shared.
- Study materials on DataStage and interview questions will be provided.
- Important interview questions, with precise answers, will be covered as the corresponding topics are discussed in class.
Core Concepts of DataStage 9.1 Online Course
- Data Warehousing lifecycle
- Data modeling concepts
- DataStage architecture
- DataStage administrator client component overview
- Designer client components
- Complex job and sequence job development
- Parameterization and parameter sets
- DataStage Director
DataStage Course Curriculum
- Course Duration: 35 hours
- Data warehousing fundamentals
- Data Modeling
- ETL Design
- DataStage Installation
- DataStage Components
- DataStage 9.1 Introduction
- DataStage Administrator
- DataStage Director
- DataStage Designer
- Parallel Palette
- Parallel Job Stages
- Job Sequencers
- Information Analyzer
- IBM WebSphere Quality Stage
- IBM Information Server
- Key Service
DataStage training covers the concepts of DataStage Enterprise Edition, its design, and how to apply it to real-life situations through a business case study in which you solve business problems.
DataStage online training begins with the big picture: why businesses need ETL tools and where DataStage fits within the product set.
Once we have covered the basic architecture of DataStage, you will use the DataStage clients to log onto the DataStage server, create a project, import metadata, and build a DataStage job. You will then compile and run the job and examine the logs. After verifying that your job runs on one node in an SMP environment, you will load and partition the data so that the job can run on multiple nodes simultaneously in an MPP, Cluster, or even a Grid environment.
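The single-node versus multi-node behavior described above is controlled by a parallel configuration file, which the engine locates through the APT_CONFIG_FILE environment variable. Below is a minimal sketch of a two-node configuration of the kind used in class; the host name "etlserver" and the resource paths are placeholders and will differ on your own installation.

```
{
  node "node1"
  {
    fastname "etlserver"
    pools ""
    resource disk "/opt/IBM/InformationServer/Server/Datasets" { pools "" }
    resource scratchdisk "/opt/IBM/InformationServer/Server/Scratch" { pools "" }
  }
  node "node2"
  {
    fastname "etlserver"
    pools ""
    resource disk "/opt/IBM/InformationServer/Server/Datasets" { pools "" }
    resource scratchdisk "/opt/IBM/InformationServer/Server/Scratch" { pools "" }
  }
}
```

With a one-node file the same compiled job runs sequentially; switching APT_CONFIG_FILE to a file with more nodes repartitions the data across them without changing the job design.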