Sr. Data Engineer - nDimensionsIT
Charlotte, NC
About the Job
Location: Charlotte
Mode: Hybrid (in office 3 days a week)
Note: Only Green Card holders and US citizens will be considered for this requirement, as the client intends to hire the consultant full time after a few months.

We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. As a Senior Data Engineer, you will play a key role in designing, building, and maintaining data pipelines and infrastructure, drawing on your expertise in Python, Snowflake, dbt (Data Build Tool), Airflow, data modeling, and ELT (Extract, Load, Transform) concepts.

Key Responsibilities:
- Data Pipeline Design and Development: Design, develop, and maintain scalable and robust data pipelines using Python, Snowflake, and dbt, ensuring efficient data extraction, transformation, and loading.
- Workflow Orchestration: Implement and manage workflow orchestration with Airflow to schedule, monitor, and automate data processing tasks and workflows.
- Data Modeling: Apply advanced data modeling techniques to design and optimize data schemas and structures, ensuring data integrity, performance, and scalability.
- ELT Implementation: Implement ELT (Extract, Load, Transform) processes to efficiently transform and load data into the Snowflake data warehouse, leveraging dbt for data transformations.
- Performance Optimization: Identify and implement optimization strategies to improve the speed, efficiency, and reliability of data pipelines and processing workflows.
- Collaboration and Mentorship: Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver effective data solutions. Mentor junior team members and provide technical guidance and support as needed.
- Continuous Improvement: Stay current with the latest technologies, tools, and best practices in data engineering, and drive continuous improvement initiatives to enhance data engineering processes and capabilities.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering roles, with a strong focus on building data pipelines and infrastructure.
- Proficiency in the Python programming language for data manipulation, scripting, and automation.
- Extensive experience with the Snowflake data warehouse platform, including data modeling, SQL development, and performance tuning.
- Hands-on experience with dbt (Data Build Tool) for data transformation and modeling.
- Experience with workflow orchestration tools such as Airflow for scheduling and monitoring data workflows.
- Deep understanding of data modeling principles and techniques, including dimensional modeling and schema design.
- Strong knowledge of ELT (Extract, Load, Transform) concepts and implementation strategies.
- Excellent problem-solving skills and the ability to troubleshoot and resolve complex data engineering issues.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
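For candidates unfamiliar with the term, the ELT pattern named in the responsibilities can be sketched in plain Python. This is a minimal illustration only, using the standard-library sqlite3 module as a lightweight stand-in for Snowflake; all table and column names are made up for the example.

```python
import sqlite3

# Minimal ELT sketch: load raw data first, then transform it with
# SQL inside the "warehouse" (here an in-memory SQLite database).
conn = sqlite3.connect(":memory:")

# Extract: pull raw records from a source system (hardcoded here).
raw_orders = [
    ("1001", "2024-01-05", "19.99"),
    ("1002", "2024-01-06", "5.00"),
]

# Load: land the data as-is in a raw staging table. In ELT, unlike
# ETL, transformation happens *after* loading.
conn.execute(
    "CREATE TABLE raw_orders (order_id TEXT, order_date TEXT, amount TEXT)"
)
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", raw_orders)

# Transform: use the warehouse's own SQL engine to cast types and
# build the cleaned table, much as a dbt model would.
conn.execute("""
    CREATE TABLE orders AS
    SELECT
        CAST(order_id AS INTEGER) AS order_id,
        order_date,
        CAST(amount AS REAL)      AS amount
    FROM raw_orders
""")

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))  # 24.99
```

In a Snowflake-plus-dbt stack, the "load" step would be a COPY into a raw schema and the "transform" step a dbt model, but the ordering is the same.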
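The dimensional modeling mentioned in the qualifications can likewise be sketched as a tiny star schema: one fact table keyed to dimension tables. Again this is an illustrative sketch using sqlite3, with hypothetical table names, not a description of any client system.

```python
import sqlite3

# Star schema sketch: fact_sales references dim_customer and dim_date.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT,
        region        TEXT
    );
    CREATE TABLE dim_date (
        date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240105
        full_date TEXT,
        month     TEXT
    );
    CREATE TABLE fact_sales (
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key     INTEGER REFERENCES dim_date(date_key),
        amount       REAL
    );
    INSERT INTO dim_customer VALUES (1, 'Acme', 'East'), (2, 'Globex', 'West');
    INSERT INTO dim_date VALUES (20240105, '2024-01-05', '2024-01');
    INSERT INTO fact_sales VALUES (1, 20240105, 100.0), (2, 20240105, 50.0);
""")

# A typical analytical query: aggregate facts by a dimension attribute.
rows = conn.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c USING (customer_key)
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # [('East', 100.0), ('West', 50.0)]
```

The design choice this illustrates: measures live in the narrow fact table, descriptive attributes in the dimensions, which keeps analytical joins simple and schemas easy to extend.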
Source : nDimensionsIT