Database Administrator 2 - Smart IMS. Inc
Austin, TX 78751
About the Job
Railroad Commission of Texas requires the services of one (1) Developer/Programmer Analyst 3, hereafter referred to as Candidate(s), who meets the general qualifications of Developer/Programmer Analyst 3, Applications/Software Development, and the specifications outlined in this document for the Railroad Commission of Texas.
All work products resulting from the project shall be considered "works made for hire" and are the property of the Railroad Commission of Texas. Selection may include pre-selection requirements that potential Vendors (and their Candidates) submit to and satisfy criminal background checks as authorized by Texas law. Railroad Commission of Texas will pay no fees for interviews or discussions that occur during the process of selecting a Candidate(s).
I. DESCRIPTION OF SERVICES
Railroad Commission of Texas requires the services of one (1) Informatica Data Engineer, hereafter referred to as Candidate(s), who meets the general qualifications of Applications/Software Development, Developer/Programmer Analyst, Level 3, and the specifications outlined in this document for the Railroad Commission of Texas.
Job Description
Performs advanced (senior-level) data pipeline development work with Informatica Cloud (IICS), including data integration, source-to-target data modeling, Extract-Transform-Load (ETL) development, consuming Oracle and RESTful API application-based data connections, targeting Snowflake data connections, designing Snowflake target data lake databases, and data warehouse modeling with ELT. This candidate will use Informatica Cloud to build mass ingestion pipelines or other extract-and-load (EL) processes from an Oracle transaction database(s) to our Snowflake Data Lake. Additional duties may include working within Snowflake to create other API-based data ingestion routines and/or setting up Data Sharing of the accumulated data. Dimensional modeling may be required in our Data Warehouse to support other objectives such as data reporting and improved data-handling performance.
" Development of ETL/ELT data mappings and workflows for data pipeline development with Informatica Cloud Data Integration.
" Practical experience using and building Informatica Mass Ingestion Pipelines.
" Demonstrated experience with Oracle Database as a Data Connector source.
" Expert with Snowflake as a Target database platform.
" Experience with the Snowflake platform and ecosystem. Knowledge of Snowflake data sharing and Snowpark is a plus.
" Knowledge of the advantages as well as previous experience working with Informatica push-down optimization
" Experience with Snowflake database creation, optimization, and architectural advantages.
" Practical experience with Snowflake SQL.
" Determines database requirements by analyzing business operations, applications, and programming; reviewing business objectives; and evaluating current systems.
" Obtains data model requirements, develops, and implements data models for new projects, and maintains existing data models and data architectures.
" Creates graphics and other flow diagrams (including ERD) to show complex database design and database modeling more simply.
" Performs related work as assigned.
" Practical experience with one time data loads as well as Change Data Capture (CDC) for bulk data movement.
" Creation of technical documentation for process and interface documentation is a key element of this role, as the Team is working on release two of a multiple release effort.
" Ability to review the work of others, troubleshoot, and provide feedback and guidance to meet tight deliverable deadlines is required.
" Ability to promote code from development environments to production.
" Familiarity with GitHub or equivalent version control systems.
" Experience working with state agencies as well as security protocols and processes.
II. TERMS OF SERVICE
Services are expected to start 4/15/2024 and to be completed by 8/31/2024. Total estimated hours per Candidate shall not exceed 750 hours. This service may be amended, renewed, and/or extended, provided both parties agree to do so in writing.
Source : Smart IMS. Inc