Senior Data Engineer II - Phaxis LLC
Aurora, IL 60101
About the Job
The Senior Data Engineer II will work with the Analytics and Data Engineering team to build a scalable data delivery platform that drives the use and adoption of distributed data technologies in a cloud environment. The Sr Data Engineer will help deploy and automate distributed technologies, manage upgrade paths, and monitor production implementations of the healthcare data platform. The position is responsible for defining, developing, and maintaining the modernization of data architecture, ETL processes, and data platforms and solutions.
There is opportunity to exercise creative freedom in building the foundation of this work. Our Data Engineer will be someone who likes to build solutions from the ground up and loves the opportunity to work with modern data warehousing technologies.
ESSENTIAL JOB FUNCTIONS AND DUTIES
- Drives research, design, and development of a secure, resilient, and self-healing data architecture foundation, including data warehouse/mart, data integration pipelines, and data-specific software components.
- Works alongside the Principal Data Architect to define and implement best practices, standards, and processes for development, analysis, testing, and tuning of big data solutions.
- Leads the creation, maintenance, and optimization of ELT data pipelines from development to production.
- Develops and enforces data integration and data quality standards across all development initiatives, adhering to the organization's information services policies and best practices.
- Executes and oversees the analysis and remediation of root causes, including technological, procedural, or resource capability deficiencies.
- Operates in an agile model alongside architects, data engineers, data scientists, data analysts, business partners, and other developers in the delivery of data solutions.
- Provides technical guidance and mentorship to less experienced Data Engineers, fostering a culture of education and skill development.
- Introduces new technologies to solve business problems, helps others understand them, and creates relevant prototypes where appropriate.
- Designs, builds, and maintains modern cloud-based data platforms using technologies such as AWS, GCP, or Azure.
- Ensures the ethical use, safety, and privacy of UHH and customer/patient data.
- Designs, builds, and maintains scalable and efficient data pipelines and architectures.
- Aligns systems with business goals and industry-standard methodologies, ensuring data integrity and accessibility.
- Maintains and enhances Data Warehouse ETL, data management, data quality, and analytics processes.
- Interfaces with users and management regarding requirements, testing, and implementation.
ESSENTIAL QUALIFICATIONS
- 5-7 years of direct data engineering experience, ideally with a healthcare benefits management organization.
- Must have development experience involving medical claims, pharmacy claims, and eligibility data, and a conceptual understanding of healthcare benefit administration.
- Demonstrated experience with a variety of relational and NoSQL technologies (e.g., Azure SQL Server, PostgreSQL, Cosmos DB).
- Experience in a cloud platform (preferably Azure) and its related technical stack (e.g., Azure Data Factory, Synapse, dbt, Fivetran, Snowflake, Databricks, Dremio, Airflow, NiFi).
- Extensive Azure data technology design and implementation experience: ADF, Azure SQL, Azure Databricks, Azure Analysis Services, Data Lakes, and Power BI.
- Strong technical understanding of data modeling, data mining, master data management, data integration, data architecture, data virtualization, data warehousing, and data quality techniques.
- Strong knowledge in SQL, modern programming languages (e.g., Python, R), and common data pipeline/data science libraries.
- Experience with Git repositories, CI/CD (preferably Azure DevOps), and software development tools, including incident tracking, version control, release management, and testing tools.
- Experience with data governance and data security, specifically moving data pipelines into production with appropriate data quality, governance, security standards, and certification.
- Adept in agile methodologies and capable of applying DevOps principles and Data Operations practices to data pipelines.
- Knowledge of CI/CD processes and source control tools such as GitHub and related dev processes.
- Experience with Snowflake and its utilities and features, such as SnowSQL, Snowpipe, Python connectivity, Tasks, Streams, Time Travel, the query optimizer, metadata management, data sharing, and stored procedures.
- In-depth understanding of Data Warehouse/ODS and ETL concepts and modeling principles: OLTP, OLAP, dimensions, facts, and data modeling.
- Familiarity with healthcare and security regulatory standards (e.g., HIPAA, CCPA).
- Strong soft skills, including effective communication and stakeholder management.
- Experience with healthcare EDI transactions (837, 835, etc.) and/or lab data strongly preferred.