Principal Data Science Engineer - Boomi
Los Angeles, CA 90079
About the Job
Are you ready to work at a fast-growing company where you can make a difference? Boomi aims to make the world a better place by connecting everyone to everything, anywhere. Our award-winning, intelligent integration and automation platform helps organizations power the future of business. At Boomi, you’ll work with world-class people and industry-leading technology. We hire trailblazers with an entrepreneurial spirit who can solve challenging problems, make a real impact, and want to be part of building something big.
JOB DUTIES:
- Build cost-effective tools and support structures needed to analyze data; perform data cleaning, feature selection, and feature engineering.
- Organize experiments in line with best practices.
- Work with multidisciplinary teams across functions to ensure models can be implemented as part of a delivered solution across many clients.
- Present findings to stakeholders to drive improvements and solutions from concept through delivery.
- Prepare reports for executive leadership that clearly communicate trends, patterns, and predictions using relevant data.
- File patent/innovation disclosures and serve on the review committee.
- Drive the development of machine learning, deep learning, data mining, and statistical modeling for predictive and prescriptive enterprise analytics to answer key business problems, improve the product, and enhance customer experiences.
- Lead the development and implementation of NLP techniques, such as sentiment analysis, entity recognition, topic modeling, and summarization, to extract insights from unstructured data (see the illustrative sketch after this list).
- Serve as a spokesperson for the business, including publishing and presenting our work at relevant tech/industry conferences.
- Lead data science projects and deliver impactful business outcomes.
- Partner with external entities and develop strategic alliances.
- Evaluate external AI services and products for their potential in building AI-driven solutions.
- Mentor and guide junior data scientists on the team by providing hands-on technical guidance in data science tools and methodologies.
- Serve as a scrum master, including planning and task tracking with tools such as Jira.
Telecommuting is allowed from anywhere in the U.S.
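For illustration only, here is a minimal Python sketch of the kind of NLP work described in the duties above, using the Hugging Face Transformers pipeline API. The default model checkpoints are downloaded at runtime, and the example sentences are invented; this is a flavor of the role, not Boomi's actual stack.

    from transformers import pipeline

    # Sentiment analysis over unstructured text (default English checkpoint).
    sentiment = pipeline("sentiment-analysis")
    print(sentiment("The new integration connector cut our sync time in half."))

    # Named entity recognition, merging sub-word tokens into whole entities.
    ner = pipeline("ner", aggregation_strategy="simple")
    print(ner("Boomi's platform connects teams in Los Angeles and beyond."))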
REQUIREMENTS:
Bachelor’s degree, or foreign equivalent, in Computer Science, Engineering, Data Science, Integrated Innovation, or a related field of study, plus six (6) years of experience in the job offered or as a Data Scientist, Data Science Engineer, AI/ML Engineer, or related occupation. Alternatively, the employer will accept a Master’s degree, or foreign equivalent, in Computer Science, Engineering, Data Science, Integrated Innovation, or a related field of study, plus four (4) years of experience in the job offered or as a Data Scientist, Data Science Engineer, AI/ML Engineer, or related occupation.
Requires demonstrated experience with:
- Machine learning techniques (clustering, decision tree learning, artificial neural networks, ensemble methods, and their real-world advantages and drawbacks); see the illustrative sketch after this list.
- Software prototyping and engineering with programming and analytics languages (Python and SQL).
- Open-source machine learning, deep learning, and natural language processing libraries, including scikit-learn, TensorFlow, PyTorch, Keras, Hugging Face Transformers, spaCy, NLTK, Gensim, Matplotlib, Seaborn, named entity recognition libraries, pandas, and NumPy, to produce deliverable modules and prototype demonstrations.
- Continuous integration and deployment (CI/CD) processes for ML models using Amazon SageMaker Pipelines, AWS Lambda, Amazon S3, and Amazon SageMaker.
- Deploying large language models for enterprise use cases, applying LLM techniques such as fine-tuning with LoRA and document retrieval.
- Managing and processing graph data using advanced techniques, including graph neural networks.
- DevOps tools (Git, Splunk, New Relic, Bitbucket, Harness, Jenkins, and Jira).
- Working with product owners/business analysts to develop models from requirements.
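For illustration only, a minimal Python sketch of the prototyping work implied by the list above, fitting one of the named ensemble methods (a random forest) on synthetic data with scikit-learn. All data and parameters here are illustrative assumptions, not part of the role's actual codebase.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Synthetic binary-classification data standing in for real customer features.
    X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    # Random forest: one of the ensemble methods named in the requirements.
    model = RandomForestClassifier(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    print(f"Holdout accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")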