Microsoft Fabric Data Architect - Kasmo Inc
Plano, TX 75074
About the Job
Responsibilities
- Own the design and implementation of Microsoft Fabric architectures to support sophisticated analytics and data processing requirements.
- Lead the development and optimization of Microsoft Fabric Spark Job Definitions to ensure efficient data processing and transformation.
- Design OneLake architecture to integrate diverse data sources into a unified data platform.
- Develop and maintain SQL queries and scripts to support data extraction, transformation, and loading (ETL) processes.
- Apply PySpark to build scalable data pipelines and perform complex data transformations.
- Collaborate with multi-functional teams to understand business requirements and translate them into technical solutions.
- Ensure data quality and integrity by implementing robust data validation and cleansing processes.
- Monitor and troubleshoot data processing workflows to ensure optimal performance and reliability.
- Mentor and guide junior team members in best practices for data architecture and engineering.
- Provide technical leadership in the Healthcare and Health Plan domains, applying domain knowledge to enhance data solutions.
- Contribute to the development of data governance policies and procedures to ensure compliance with regulatory requirements.
- Communicate effectively with stakeholders to present technical solutions and gather feedback.
Requirements
- 12+ years of experience implementing large-scale data and analytics platforms
- 5+ years of experience architecting and building data platforms on Azure
- Hands-on experience with cloud data platforms such as Azure Synapse Analytics, Azure Databricks, or Microsoft Fabric
- Ability to architect, implement, and optimize data workflows and pipelines on these platforms
- In-depth understanding of data engineering principles, data architecture, and data management practices
- Proficiency in designing and implementing scalable data solutions
- Strong to expert-level PySpark skills for building scalable data pipelines
- Proficiency in developing and optimizing Spark Job Definitions
- In-depth knowledge of OneLake for data integration and management
- Expertise in writing and optimizing SQL queries and scripts
- Understanding of data modeling and data mappings, with the ability to evaluate data model changes and propose best practices and guidelines
- Experience in Healthcare or Health Plans is a plus
- Strong problem-solving skills and attention to detail
Source: Kasmo Inc