Data Architect - Karwell Technologies
South Plainfield, NJ 07080
About the Job
Job Title: Data Architect
Location: Remote
Duration: Long Term
Skills: Azure Data Factory, Azure Data Lake, Informatica, Guidewire, Python/Spark
Job Description:
We are looking for an experienced Data Architect with a strong background in building scalable data solutions, integrating enterprise systems, and leveraging cloud-based technologies. The ideal candidate will have hands-on experience with Azure, ETL tools, and advanced data pipeline creation, along with deep expertise in the Property and Casualty (P&C) insurance domain and Guidewire Data.
Key Responsibilities:
- Design and build metadata-driven data pipelines using tools like Azure Data Factory (ADF) and Informatica.
- Develop and optimize Operational Data Stores (ODS) leveraging Azure Data Lake.
- Implement and manage data solutions on Azure, ensuring efficient cloud resource utilization and cost optimization.
- Use Azure Functions for data processing automation and orchestration.
- Work with Guidewire Data and ensure seamless integration and processing.
- Write robust and scalable code using Python, T-SQL, and Spark to support custom data transformation processes.
- Integrate and process data from diverse sources into Azure Data Lake and SQL Server.
- Apply knowledge of Hadoop (a plus) to handle large-scale data processing and storage needs.
- Utilize prior Property and Casualty (P&C) insurance domain experience to align technical solutions with business requirements.
- Collaborate with stakeholders to gather requirements, define data strategies, and translate business goals into technical implementations.
- Provide clear and effective communication across technical and non-technical teams.
Required Skills:
- Azure: Strong experience with Azure Data Factory, Azure Data Lake, Azure Functions, and general cloud architecture.
- ETL Tools: Expertise in Informatica and building scalable ETL workflows.
- SQL Server: Advanced knowledge of SQL Server and T-SQL for data management and processing.
- Programming: Proficiency in Python and Spark for data engineering tasks.
- Guidewire Data: Experience working with Guidewire Data.
- ODS Development: Proven expertise in building ODS from Data Lakes.
- Cloud Optimization: Hands-on experience in Azure Cloud Cost Optimization strategies.
- P&C Domain: Strong understanding of the Property and Casualty insurance domain.
- Communication Skills: Excellent verbal and written communication skills are crucial for this role.
- Data Migration: Experience working on large-scale data migration projects.
Source : Karwell Technologies