Data Engineer (Middle) ID26045 - AgileEngine
Boca Raton, FL 33427
About the Job
AgileEngine is one of the Inc. 5000 fastest-growing companies in the US and a top-3 ranked dev shop according to Clutch. We create award-winning custom software solutions that help companies across 15+ industries change the lives of millions.
If you like a challenging environment where you’re working with the best and are encouraged to learn and experiment every day, there’s no better place - guaranteed! :)
What you will do
Work collaboratively with other engineers, architects, data scientists, analytics teams, and business product owners in an agile environment;
Architect, build, and support the operation of Cloud and On-Premises enterprise data infrastructure and tools;
Design and build robust, reusable, and scalable data-driven solutions and data pipeline frameworks to automate the ingestion, processing, and delivery of both structured and unstructured batch and real-time streaming data;
Build data APIs and data delivery services to support critical operational processes, analytical models, and machine learning applications;
Assist in the selection and integration of data-related tools, frameworks, and applications required to expand platform capabilities;
Understand and implement best practices in the management of enterprise data, including master data, reference data, metadata, data quality, and lineage.
Must haves
4+ years of experience with Python or Java (Python preferred, or willingness to work with it);
4+ years of experience building data lakes and cloud data platforms leveraging cloud-native (GCP/AWS) architecture, ETL/ELT, and data integration;
3 years of development experience with cloud services (AWS, GCP, Azure) utilizing support tools such as GCS, Dataproc, Cloud Dataflow, Airflow (Composer), Kafka, and Cloud Pub/Sub;
Expertise in developing distributed data processing and streaming frameworks and architectures (Apache Spark, Apache Beam, Apache Flink);
Experience with Snowflake is a must;
In-depth knowledge of NoSQL database technologies (e.g. MongoDB, BigTable, DynamoDB);
Expertise in build and deployment tools (Visual Studio, PyCharm, Git/Bitbucket/Bamboo, Maven, Jenkins, Nexus);
4+ years of experience and expertise in database design techniques and philosophies (e.g. RDBMS, Document, Star Schema, Kimball Model);
4 years of experience with integration and service frameworks (e.g. API gateways, Apache Camel, Swagger API, ZooKeeper, Kafka, messaging tools, microservices);
Expertise with containerized microservices and REST/GraphQL-based API development;
Experience leveraging continuous integration/delivery tools (e.g. Jenkins, Docker, containers, OpenShift, Kubernetes, and container automation) in a CI/CD pipeline;
Advanced understanding of software development and research tools;
Attention to detail and a results-oriented mindset, with a strong customer focus;
Ability to work as part of a team and independently;
Analytical and problem-solving skills;
Strong technical communication skills;
Ability to prioritize workload to meet tight deadlines;
Working experience with wealth/asset management projects;
Upper-intermediate English level.
Nice to haves
Airflow;
BigQuery;
Kafka.
The benefits of joining us
Professional growth
Accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps.
Competitive compensation
We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities.
A selection of exciting projects
Join projects involving modern solution development for top-tier clients, including Fortune 500 enterprises and leading product brands.
Flextime
Tailor your schedule for an optimal work-life balance by choosing to work from home or go to the office, whatever makes you the happiest and most productive.