Data Engineer - RTH - Atlantic Partners Corporation
Newark, NJ 07102
About the Job
Job description:
Our direct client is seeking a Big Data Engineer for a 6-month contract-to-hire opportunity at their Newark, NJ location.
Manager (Data Engineering Lead, Data Team) - Engineering lead to help design and build the CDO enterprise data projects

Responsibilities:
- Implement and support end-to-end data lake, data warehouse, data mart, business intelligence, analytics, and data services solutions (ingest, storage, integration, processing, services, access) in AWS, including:
  - data intake/request/onboarding services for the data lake, with service documentation
  - data lake ingestion services for batch/real-time ingest, with service documentation
  - data processing services (ETL/ELT) for batch/real-time workloads (Glue/Kinesis/EMR), with service documentation
  - data storage services for the data lake (S3), data warehouses (RDS/Redshift), and data marts, with service documentation
  - a data services layer including Athena, Redshift, RDS, microservices, and APIs
  - pipeline orchestration services including Lambda, Step Functions, and (optionally) MWAA
  - data security services (IAM/KMS/SM/encryption/anonymization/RBAC), with service documentation
  - data access provisioning services (accounts, IAM roles, RBAC), plus processes, documentation, and education
  - data provisioning services for consumption patterns including microservices, APIs, and extracts
  - metadata capture and catalog services for the data lake (S3/Athena), data warehouses (RDS/Redshift), and microservices/APIs
  - metadata capture and catalog services for pipeline/log data used in monitoring and support
- Implement CI/CD pipelines
- Prepare documentation for data projects utilizing the AWS-based enterprise data platform
- Implement high-velocity streaming solutions using Amazon Kinesis, SQS, and SNS
- Migrate data from traditional relational database systems to AWS relational databases such as Amazon RDS, Aurora, and Redshift
- Migrate data from traditional file systems and NAS shares to the AWS data lake (S3) and relational databases such as Amazon RDS, Aurora, and Redshift
- Migrate data from APIs to the AWS data lake (S3) and relational databases such as Amazon RDS, Aurora, and Redshift
- Provide cost/spend monitoring and reporting for AWS-based data platform initiatives
- Provide governance/audit reporting for access to the AWS-based data platform
- Lead the implementation of a data lake strategy to enable LOBs and Corporate Functions with a robust, holistic view of data for data-driven decision making
- Serve as delivery lead for EDP data initiatives, partnering with the product owner
- Partner with the immediate engineering team, product owner, IT, and external partners on the EDP agenda
- Provide technology thought leadership, consulting, and coaching/mentoring
- Establish development, QA, staging, and production migration/support processes
- Establish best practices for development and support teams
- Deliver end-to-end data initiatives from ingest to consumption via microservices/APIs, JDBC/ODBC, file extracts, etc.
- Work with the scrum master to develop and own the backlog, stories, epics, and sprints
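The real-time ingestion and Lambda responsibilities above can be sketched as a Kinesis-triggered Lambda handler. This is a minimal illustration only, not code from the client: the handler name and the decision to merely parse records (rather than land them in S3 or Redshift) are assumptions for the sketch.

```python
import base64
import json

def handler(event, context):
    """Decode Kinesis stream records and return the parsed payloads.

    Kinesis delivers record data base64-encoded inside the Lambda event.
    In a real deployment this handler would write the decoded records to
    the S3 data lake or a Redshift staging table; here it only parses them.
    """
    records = []
    for rec in event.get("Records", []):
        raw = base64.b64decode(rec["kinesis"]["data"])
        records.append(json.loads(raw))
    return {"batch_size": len(records), "records": records}
```

The handler can be exercised locally by invoking it with a synthetic event whose records carry base64-encoded JSON, which is also how unit tests for such functions are typically written before QA.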
Requirements:
- Bachelor's degree in Computer Science, Software Engineering, MIS, or an equivalent combination of education and experience
- Experience implementing and supporting data lakes, data warehouses, and data applications on AWS for large enterprises
- Programming experience with Java, Python/Scala, and shell scripting
- Solid experience with AWS services such as CloudFormation, S3, Glue, EMR/Spark, RDS, Redshift, DynamoDB, Lambda, Step Functions, IAM, KMS, SM, etc.
- Solid experience implementing solutions on AWS-based data lakes
- Experience implementing metadata solutions leveraging AWS non-relational data solutions such as ElastiCache and DynamoDB
- AWS Solutions Architect or AWS Big Data Certification preferred
- Experience in AWS data lake/data warehouse/business analytics
- Experience with and understanding of core AWS services such as IAM, CloudFormation, EC2, S3, EMR/Spark, Glue, DataSync, CloudHealth, CloudWatch, Lambda, Athena, and Redshift
- Experience in system analysis, design, development, and implementation of data ingestion pipelines in AWS
- Experience with DevOps and Continuous Integration/Delivery (CI/CD) concepts and tools
- Experience with business intelligence tools such as Tableau, Power BI or equivalent
- Knowledge of ETL/ELT
- Experience in production support from Level 1 to Level 3
- Awareness of Data Management & Governance tools
- Working experience with Hadoop, HDFS, Sqoop, Hive, Python, and Spark is desired
- Experience working on Agile projects
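The metadata-solution requirement above (catalog services backed by non-relational stores such as DynamoDB) can be sketched as a tiny in-memory catalog keyed the way a DynamoDB table might be. The record fields, the `dataset` partition key, and the `version` sort key are illustrative assumptions, not details from the posting.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DatasetRecord:
    """One catalog entry, shaped like an item in a DynamoDB metadata table.

    `dataset` plays the role of the partition key and `version` the sort
    key; both names are hypothetical for this sketch.
    """
    dataset: str
    version: int
    s3_prefix: str
    columns: list = field(default_factory=list)
    registered_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

class MetadataCatalog:
    """In-memory stand-in for a DynamoDB-backed metadata catalog."""

    def __init__(self):
        self._items = {}  # (dataset, version) -> DatasetRecord

    def put(self, record: DatasetRecord) -> None:
        """Register or overwrite one dataset version."""
        self._items[(record.dataset, record.version)] = record

    def latest(self, dataset: str) -> DatasetRecord:
        """Return the highest-versioned entry for a dataset."""
        versions = [k[1] for k in self._items if k[0] == dataset]
        return self._items[(dataset, max(versions))]
```

In a production design the dictionary would be replaced by DynamoDB `put_item`/`query` calls, with ElastiCache optionally fronting hot lookups; the access pattern (fetch latest version by dataset name) stays the same.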
Manager (Data Engineer, Data Platform Team) - Full Stack Engineer for the AWS data platform (engineering applications that integrate with the enterprise data platform using AWS)

Responsibilities:
- Design, build, and maintain efficient, reusable, and reliable architecture and code
- Ensure the best possible performance and quality of high-scale web applications and services
- Design and build highly performant, function-based APIs
- Develop microservices using Node.js, Python, or Java
- Ensure all code is unit tested prior to QA, including interfaces with other systems
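The API-building duties above can be sketched as a minimal Lambda handler in the API Gateway proxy-integration style (a `statusCode`/`headers`/`body` response envelope). The route, query parameter, and greeting payload are invented for the example.

```python
import json

def api_handler(event, context):
    """Minimal Lambda handler following the API Gateway proxy contract.

    Reads an optional `name` query-string parameter from the event and
    returns a JSON greeting with an HTTP 200 status.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Because the handler is a plain function of an event dict, it can be unit tested locally before QA, consistent with the testing responsibility listed above.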
Requirements:
- 5+ years of experience as a full-stack developer
- Experience writing Python, Java, or Node.js code for Lambda functions
- Experience developing business applications using NoSQL/SQL databases
- Experience working with object stores (S3) and JSON is a must-have
- Experience developing web applications is a must-have
- Experience integrating SSO/LDAP/AD is a must-have
- Experience working with REST APIs, code packages, and deployment tools
- Solid experience with AWS services - API Gateway, Lambda, Step Functions, SQS, DynamoDB, S3, and Elasticsearch
- Serverless application development using AWS Lambda
- Experience with front-end technologies such as React is a must-have
- Experience in back-end development (microservices/APIs) is a must-have
- Experience with metadata capture/discovery tools such as CKAN is a big plus
- Experience in an AWS-based data environment is a plus
- Experience integrating with third-party tools is a must-have
- Experience with CI/CD and DevOps is a must-have
- Bachelor's degree in Computer Science, a similar technical field, or equivalent professional experience
Source: Atlantic Partners Corporation