Big Data Architect - Yottaa
Waltham, MA 02451
About the Job
Yottaa is seeking a Big Data Architect to serve as a key technical leader, responsible for architecting and managing the complex data solutions that support our advanced website performance and security product. Reporting to the CTO, you will lead data architecture design, drive data strategy, and partner with cross-functional teams to build scalable, secure, and high-performance big data solutions. This role will leverage both technical acumen and business insight to transform large datasets into actionable intelligence, improving the functionality, efficiency, and analytical value of our platform.
Your Role
Data Architecture and Strategy
- Develop and implement the data architecture and strategy for Yottaa’s platform, aligning with business and technical goals.
- Create data models, define best practices, and establish data management processes to support product development and scalability.
- Collaborate with the engineering team to design, implement, and optimize data pipelines and big data frameworks.
Solution Design and Implementation
- Architect, build, and manage large-scale data processing systems to analyze, process, and store vast amounts of data generated from customer websites.
- Lead the selection and implementation of cloud-based and on-premises data technologies.
- Ensure data architecture solutions are secure, scalable, and cost-effective, optimizing for performance and resilience.
Cross-Functional Collaboration
- Work closely with product and engineering teams to understand data requirements and align data architecture with the needs of the platform.
- Ensure data solutions adhere to industry regulations and best practices for security and compliance.
Data Governance and Optimization
- Establish and enforce data governance policies and procedures, ensuring data quality, security, and compliance.
- Identify opportunities to streamline and optimize data processes, improving data accessibility, accuracy, and processing speed.
- Proactively troubleshoot and resolve data-related technical issues and challenges.
Your Experience
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field; advanced degree preferred.
- 7+ years of experience in big data architecture, data engineering, or related roles, with experience in a SaaS environment preferred.
- Expertise in big data technologies (e.g., Hadoop, Spark, Kafka) and cloud platforms (AWS, Google Cloud, Azure).
- Proficiency with SQL, NoSQL databases, and data warehousing solutions.
- Strong experience with data modeling, ETL processes, and data pipeline architectures.
- Familiarity with data governance, compliance, and security practices (e.g., GDPR, CCPA).
- Excellent problem-solving abilities, with a strategic mindset and a focus on scalability.
- Strong interpersonal skills, with the ability to communicate complex data concepts to technical and non-technical stakeholders.
- Self-motivated, detail-oriented, and able to work both independently and collaboratively in a fast-paced environment.