Senior Data Engineer - Neon Redwood
San Francisco, CA
About Neon Redwood
Neon Redwood is a data services consultancy building cutting-edge AI and data-driven solutions. We are a team of passionate engineers and data experts, and we are looking for a Senior Data Engineer to help us develop and expand our data infrastructure and analytics capabilities.
The Role
We are seeking an experienced Senior Data Engineer with a strong data engineering background and a passion for working with large-scale data sets to help us build out our data infrastructure and analytics capabilities.
The ideal candidate will have at least 5 years of professional data engineering experience and a solid understanding of Python, BigQuery, and Google Cloud Platform (GCP). This full-time role involves working closely with our CTO and other team members to design, develop, and maintain data pipelines, ETL processes, and data warehousing solutions.
Responsibilities
- Collaborate with the CTO and other team members to design, develop, and maintain data pipelines and ETL processes.
- Write clean, efficient, and maintainable code in Python and other relevant technologies.
- Implement and optimize data storage and processing solutions using BigQuery and Google Cloud Platform (GCP).
- Ensure data quality and integrity through proper data validation and monitoring techniques.
- Participate in code reviews and provide constructive feedback to improve overall code quality.
- Stay up-to-date with the latest industry trends and technologies to ensure our data infrastructure remains competitive.
- Assist in the development and launch of new data-driven tools and products.
- Mentor and guide junior engineers, fostering a culture of continuous learning and improvement.
Requirements
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- 5+ years of professional data engineering experience.
- Strong proficiency in Python, BigQuery, and Google Cloud Platform (GCP).
- Experience with data pipeline and ETL process design and development.
- Excellent problem-solving skills and the ability to work independently or as part of a team.
- Strong communication, collaboration, and leadership skills.
- Passion for working with large-scale data sets and staying current with industry trends.
Additional Skills (Nice to Have)
- Experience with other data processing technologies and platforms (e.g., Apache Beam, Dataflow, Hadoop, Spark).
- Experience with data visualization tools and libraries (e.g., Tableau, Looker, D3.js).
- Knowledge of machine learning and AI concepts.
- Experience with real-time data processing and streaming technologies (e.g., Kafka, Pub/Sub).
Benefits
- Flexible hours and frequent get-togethers
- Fully covered health insurance for employees and their eligible dependents
- Fully covered vision and dental for employees and their eligible dependents
- Unlimited time off
- Company 401(k) plan with employer contributions
- Supplemental monthly health and wellness stipend