IBM Infosphere DataStage - Tekfortune Inc.
Plano, TX
About the Job
Job Role: IBM Infosphere DataStage
Location: Plano, TX (5 days work from office; onsite joining required from Day 1, no exceptions)
Job Description
Mandatory required skills: IBM Infosphere DataStage Suite.
1. Should know the ERwin data modeling tool to design logical/physical models.
2. Develop and maintain data dictionaries and data models.
3. Knowledge of ETL/database concepts and Snowflake concepts is an added advantage.
- Should have 5 to 8 years of experience in ETL design and development using IBM DataStage components.
- Should have worked at least 3 years with DataStage 9.1 or later.
- Should have extensive knowledge of Python.
- Should have extensive knowledge of Unix shell scripting.
- Thorough understanding of DW principles (fact and dimension tables, dimensional modeling, and data warehousing concepts).
- Research, develop, document, and modify ETL processes per data architecture and modeling requirements.
- Ensure appropriate documentation for all new development and modifications of ETL processes and jobs.
- Should have worked in either the Insurance or Banking domain.
- Should have extensive working experience with different data sources such as Netezza, Oracle, Snowflake, DB2, etc.
- Should be very good at writing complex SQL queries.
- Experience translating functional and non-functional requirements into system requirements.
- Exposure to scheduling tools such as BMC Control-M.
- Experience with agile methodologies is desirable.
- Exposure to Big Data technologies will be an added advantage.
- End-to-end understanding of source -> ETL -> application layer -> reporting.
- Must have good oral, written, and presentation skills to interact with business and technical teams on a day-to-day basis, as this is a client-facing role.
- Should be self-driven and able to run the show with minimal or no assistance.
- Strong experience building data pipelines in the cloud and on-prem, preferably on AWS.
- Experience with large-scale analytical solutions on Snowflake; in-depth experience with Snowflake architecture, optimization levers, cost management principles, etc.
- Strong knowledge of DW concepts, dimensional modeling, SCD2, data modeling, ETL/ELT, and data and information management.
- Strong RDBMS and SQL knowledge.
Source: Tekfortune Inc.