Senior Data Engineer at Vastek Inc
Dallas, TX
About the Job
Only GC and USC candidates will be considered for this position.
Position Summary
The Senior Data Engineer will work with our data management team. This role focuses on leveraging advanced technologies such as Databricks, Delta Lake, Fivetran, and Google Cloud Platform (GCP) to enhance our data infrastructure and drive innovative data solutions. The engineer will apply experience with Databricks administration, Unity Catalog, and JAR file management to optimize our data processes and support advanced analytics initiatives.
Required Knowledge:
- General knowledge of advanced concepts, practices, and procedures related to the design and implementation of robust data pipelines.
- Proven expertise in Databricks administration and managing large-scale data environments.
- Strong experience with SQL, Python, and other scripting languages commonly used in data engineering.
- Familiarity with Fivetran or similar data integration tools and an understanding of ETL processes.
- Previous experience managing end-to-end data workflows.
Required Experience:
- Minimum of five (5) years of experience [nine (9) years for non-degreed candidates] in a data engineering role with significant exposure to Databricks, Delta Lake, and cloud platforms like GCP.
Preferred Experience:
- Experience with Unity Catalog and managing data access and security within a Databricks environment.
- Knowledge of Java Archive (JAR) files and their applications in data projects.
Required Skills:
- Must have strong organizational skills.
- Must be detail-oriented, with a proven ability to prioritize work.
- Must have effective verbal and written communication skills.
- Must have the ability to work with limited supervision and as part of a team.
- Sound decision-making abilities.
- Excellent problem-solving skills and the ability to work in a dynamic, fast-paced environment.
- Strong communication and collaboration skills, capable of working effectively across multiple teams.
Preferred Skills:
- Experience with machine learning and AI workflows.
- Familiarity with Apache workflow tools, such as Apache Airflow or Apache NiFi, showcasing the ability to design, implement, and manage complex data pipelines and workflows.
- Knowledge of data visualization and reporting tools (e.g., Power BI, Tableau).
Professional Certification:
- Certifications in data engineering or a related field a plus.