GCP Developer (Hadoop and Hive) at Tech Mahindra
Dearborn, MI 48124
About the Job
Job Title: GCP Developer with Hadoop / Hive
Location: Dearborn, MI (FTE)
Years of Experience: 5 to 7
Job Description
Develop a comprehensive migration strategy for our data infrastructure and workloads to GCP. Handle complex migrations from legacy Teradata warehousing solutions or on-premises Hadoop/Hive to BigQuery on GCP.
Implement cloud-based solutions using GCP services and technologies such as BigQuery, Dataflow, Dataproc, Bigtable, Dataform, Data Fusion, Cloud Spanner, Cloud SQL, Cloud Functions, and Cloud Scheduler.
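One small piece of the Hive-to-BigQuery migration work described above is translating legacy Hive column types into BigQuery Standard SQL types. The following is a minimal illustrative sketch (not part of the posting, and intentionally simplified — a real migration tool would cover many more types and edge cases):

```python
# Minimal sketch: map common Hive column types to BigQuery Standard SQL types.
# The mapping below is illustrative and simplified, not exhaustive or official.
HIVE_TO_BIGQUERY = {
    "TINYINT": "INT64",
    "SMALLINT": "INT64",
    "INT": "INT64",
    "BIGINT": "INT64",
    "FLOAT": "FLOAT64",
    "DOUBLE": "FLOAT64",
    "BOOLEAN": "BOOL",
    "STRING": "STRING",
    "VARCHAR": "STRING",
    "CHAR": "STRING",
    "BINARY": "BYTES",
    "TIMESTAMP": "TIMESTAMP",
    "DATE": "DATE",
    "DECIMAL": "NUMERIC",
}

def translate_column(name: str, hive_type: str) -> str:
    """Return a BigQuery DDL fragment for a single Hive column."""
    # Drop any length/precision suffix, e.g. VARCHAR(255) -> VARCHAR.
    base = hive_type.split("(")[0].strip().upper()
    bq_type = HIVE_TO_BIGQUERY.get(base)
    if bq_type is None:
        raise ValueError(f"No mapping for Hive type: {hive_type}")
    return f"{name} {bq_type}"

print(translate_column("order_id", "BIGINT"))       # prints: order_id INT64
print(translate_column("customer", "VARCHAR(255)")) # prints: customer STRING
```

In practice a migration would also need to handle complex Hive types (ARRAY, MAP, STRUCT), DECIMAL precision and scale, and partitioning differences between Hive tables and BigQuery.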
GCP Infrastructure as Code with Terraform; development and deployment through Tekton pipelines; GitHub; Python
Skills Preferred:
Cloudera Hadoop/Hive; Rally; Agile framework
Tech Mahindra is an Equal Employment Opportunity employer. We promote and support a diverse workforce at all levels of the company. All qualified applicants will receive consideration for employment without regard to race, religion, color, sex, age, national origin, or disability. All applicants will be evaluated solely on the basis of their ability, competence, and performance of the essential functions of their positions.