Data Engineer - Tek Ninjas
Chicago, IL 60632
About the Job
Position: Data Engineer
Location: Chicago, IL (Hybrid)
Duration: 9 months, with possible extension
Top 5 Skill sets
1. DevOps
2. AWS Cloud
3. Terraform
4. Python
5. CI/CD pipelines
Nice to have skills or certifications:
1. Blue-Green deployments
2. Kubernetes
3. Ansible Playbooks
JOB OVERVIEW AND RESPONSIBILITIES
In this role you will partner with various teams to define and execute data acquisition, storage, transformation, and processing, and to make data actionable for operational and analytics initiatives that create sustainable revenue and share growth. This role requires expertise in United's data sources and technology, business intuition, and a working knowledge of data transformation and analytical tools.
• Support large scale data pipelines in a distributed and scalable environment
• Enable and optimize production AWS environment for data infrastructure and frameworks
• Expertise in creating Terraform modules to automate deployments
• Knowledge of Databricks and data lake technologies
• Partner with development teams and other department leaders/stakeholders to provide cutting edge technical solutions that enable business capabilities
• Participate in and lead the design and development of innovative batch and streaming data applications using AWS technologies
• Provide the team with technical direction and the approach to be undertaken, and guide them in resolving queries and issues
• AWS Certification
• Knowledge: Python, Bash scripting, PySpark, AWS Services (Airflow, Glue, Lambda, others), Terraform, Databricks
• Skills: Thorough troubleshooter, hands-on AWS technology leader, people person with the ability to see an undertaking through to completion
• Abilities: Solving problems under pressure and in constrained scenarios, leadership, sound judgment
• Must be fluent in English (written and spoken)
• Coordinate and guide cross-functional projects that involve team members across all areas of the enterprise, vendors, external agencies, and partners
• Ability to manage multiple short- and long-term deliverables in a fast-paced, demanding environment, while staying flexible to dynamic needs and priorities
• Manage agile development and delivery by collaborating with project manager, product owner and development leads
REQUIRED
• Bachelor's degree in quantitative field (statistics, software engineering, business analytics, information systems, aviation management or related degree)
• 5+ years of experience in data engineering or ETL development role
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
• Strong analytical skills related to working with structured, semi-structured, and unstructured datasets
• Experience with BigQuery, SQL Server, etc.
• Experience with AWS cloud services: Redshift, S3, Athena, etc.
• Experience with SQL and various database interface tools: SSMS, Oracle SQL Developer, etc.
• Passionate about solving problems through data and analytics, and creating data products including data models
• Strong initiative to take ownership of data-focused projects, get involved in the details of validation and testing, as well as provide a business user perspective to their work
• Ability to communicate complex quantitative concepts in a clear, precise, and actionable manner
• Proven proficiency with Microsoft Excel and PowerPoint
• Strong problem-solving skills, using data to tackle problems
• Outstanding writing, communication, and presentation skills
PREFERRED
• Master's degree
• Experience with Quantum Metric and Akamai
• Experience with languages: Python, R, etc.
• Strong experience with continuous integration & delivery using Agile methodologies
• Data engineering experience with transportation/airline industry
• Strong problem-solving skills
Source : Tek Ninjas