Data Engineer - LER TechForce
Columbus, IN 47201
About the Job
Data Engineer
Are you looking for a role where you can use your knowledge of big data pipelines to make a difference in the automotive industry? Would you like to work for a company that provides an innovative work environment, flexible schedules, and ongoing professional development?
Who we are:
LER TechForce, an LHP affiliate (formerly LHP Engineering Resources), is an industry leader in embedded controls, software, functional safety, and engineering IT talent. For over 20 years, LER has worked with customers across North America to meet their engineering resource challenges.
We have a position for an experienced data engineer to work on Big Data pipelines (Ingest-Transform-Deliver).
Job Location: Remote from anywhere in the U.S. or Mexico, working Eastern Time zone hours.
What you will be doing:
- Support ongoing Big Data initiatives
- Define Big Data pipeline setup guidelines
- Source data identification and analysis
- Data quality research and mitigation
- Design and development
- Data ingestion into the Data Lake
- Data curation and aggregation
- Coordinate code promotion with the appropriate support groups
- Onboard and train new team members
- Work with the Advanced Analytics Data Engineering team to document the process for business users to consume the data and the big data system, and provide training as needed
- Agile story planning
- Identify business requirements
- Define business process flows and benefits
- Prioritize functional and non-functional requirements
- High-level source systems analysis
- POC and production solution implementation for key data and analytics needs
- Technical/system requirements
- High-level architecture
- Architecture recommendations
- Conceptual data model
- Data ingestion and curation flow
- Reporting and analytics design
- Process to pull data into the Data Lake
- Data interface requirements
- Data quality requirements
- Solution implementation and development
The ideal candidate will be knowledgeable in the following areas:
- Big Data Pipelines
- Azure Data Services
- Azure Databricks
- Azure API
- Programming languages and frameworks: Scala, Python, PySpark, Node.js, and Express
What you’ll get:
- Full benefits: medical, dental, and 401(k) match
- Ongoing professional development opportunities
- Flexible Hybrid schedule
- The opportunity to work on industry leading projects
What you’ll need to be successful:
- Bachelor's degree (or equivalent) from a college or university in Engineering or another relevant technical discipline. Degree programs considered: Bachelor's, Master's, PhD.
- 2-7 Years of experience working on Big Data pipelines (Ingest-Transform-Deliver) with specific hands-on experience with:
- Azure Data Services (Data Lake, Data Factory, SQL DB, Synapse, Cosmos DB)
- Azure Databricks (knowledge of clusters; using job, interactive, and streaming clusters)
- Azure API services (Web Apps; knowledge of Azure AD, Azure Load Balancer, Azure Functions)
- Programming languages and frameworks: Scala, Python, PySpark, Node.js, and Express
- CI/CD and test automation of Big Data pipelines using BDD/TDD frameworks and tools such as qTest and Selenium
- Humble, teachable, and able to solve problems independently
- Effective and collaborative team player
- Strong written and verbal communicator
Click the Easy Apply button to learn more.
#LI-NC1