Cloud Database Engineer at Collabera
Phoenix, AZ
About the Job
Day-to-Day:
- Design and develop data pipelines using GCP Dataflow for processing high-volume financial data (millions of records).
- Utilize Airflow to orchestrate and schedule data pipeline execution within the GCP environment.
- Migrate on-premise data sources to GCP and integrate data pipelines with internal applications.
- Write efficient Python and potentially some Java code to manipulate and transform data within the pipelines.
- Collaborate with engineers and data analysts to understand data requirements and translate them into technical solutions.
- Leverage BigQuery for data warehousing and analysis of processed data.
- Monitor and troubleshoot data pipelines to ensure smooth operation, data quality, and timely delivery.
- Maintain and enhance existing data pipelines to improve efficiency and scalability.
- Stay up to date with the latest advancements in GCP and Airflow, and with GCP best practices.
Must Haves:
- 7+ years of experience
- Proven experience in developing data pipelines using GCP Dataflow.
- Experience with Apache Airflow for workflow orchestration and scheduling.
- Familiarity with BigQuery for data warehousing and querying.
- Proficiency in SQL for data manipulation and querying.
- Working knowledge of Python scripting for data processing tasks.
- Basic understanding of Google Cloud Platform (GCP) concepts
Nice to Haves:
- Experience integrating data pipelines with internal applications
- Experience working in a cloud environment (AWS, Azure, or GCP)
- Experience with migrating on-premise data to cloud platforms (AWS, Azure, or GCP)
- Previous experience working in banking or finance
Salary
$50 - $60/hour