REMOTE Position - Looking for Data Engineer (GCP) - Immediate Start - W2 Position at Saanvi Technologies
Dearborn Hts, MI 48127
About the Job
W2 Contract - H1B Sponsorship Provided!
H1B Processing
Various EAD Support
Immediate GC and I-140 Processing
Hybrid/Remote
Job Description:
This role requires a blend of technical expertise, collaboration, and innovation to support near real-time analytics, build data products, and ensure data quality and availability. Responsibilities include:
- Collaborate with the Manufacturing & Quality Analytics Team, IT, Data Tech, and external solution partners to develop and deliver the Data Factory.
- Contribute to data standards, interoperability, quality, and availability to support GDI&A goals.
- Design and implement data engineering and streaming solutions for real-time analytics.
- Build analytical model objects and data products to support scalable growth and accelerate business value delivery.
- Develop data pipelines for cloud solutions, focusing on IIoT & Quality projects.
- Master various data sources and contribute to the creation of the Data Discovery Hub.
- Perform ETL activities and assist with customer inquiries and incident resolution.
- Lead integration projects and liaise with business customers on status updates.
- Optimize code design within and across teams for improved performance.
Skills Required:
- Google Cloud Platform
- Data Engineering
- Data Warehousing
- Data Modeling
- RDBMS
- SQL
- Python
- ETL
Skills Preferred:
- Pub/Sub
- Dataform
- Dataflow
- dbt
- Airflow
- Astronomer
- Kafka
Experience Required:
- 3+ years of experience with data engineering
- 3+ years of experience with data warehousing
- 3+ years of experience with Google Cloud Platform, RDBMS, SQL, and the Hadoop ecosystem
- 2+ years of experience with Python and Apache Spark
- 2+ years of experience with tuning and query optimization
Experience Preferred:
- Familiarity with GCP tools (BigQuery, Cloud Storage, Pub/Sub, Dataflow, etc.)
- Experience with Alteryx, Tableau, and Looker
- Excellent critical thinking, proactive decision-making, and communication skills
- Experience coordinating data landing activities
- Strong collaboration and team leadership abilities
- Knowledge of machine learning concepts and models
Education Required:
Bachelor's degree in Computer Science, Computer Engineering, Analytics, or a related field.
Education Preferred:
Master's degree in Computer Science, Computer Engineering, Analytics, or a related field.