Big Data Architect with Security Clearance - SAIC
Chantilly, VA 20151
About the Job
Description: SAIC is seeking an experienced, results-oriented, mission-driven Big Data Architect with a specialized focus on Data Engineering to perform data model design, data formatting, and ETL development optimized for efficient storage, access, and computation in support of national security objectives.
Responsibilities include, but are not limited to:
• As part of an Agile team, increase innovation capacity and drive the velocity of development of data ingestion and data analysis.
• Synchronize efforts with other tasks in assembling data technologies to control the flow of data from source to value, with the goal of speeding up the process of deriving value and insight.
• Bring a passion for unlocking the insights held by a dataset, along with a solid understanding of and experience with developing, automating, and enhancing all parts of the data pipeline, including ingestion, processing, storage, and exposing data for consumption.
• Implement data tests for quality, and focus on improving inefficient tooling and adopting new, transformative technologies while maintaining operational continuity.
Qualifications
Required:
• Active TS/SCI with Polygraph Clearance
• Bachelor's Degree in Computer Science, Information Systems, or Engineering, or equivalent years of experience in lieu of a degree
• 14 years of overall related professional experience
• 3+ years of hands-on development experience using Java, JavaScript, or Python to ETL data
• ETL experience, including formats such as XML, JSON, and YAML; normalizing data; and high-volume data ingestion
• 3+ years' experience using and ingesting data into SQL and NoSQL database systems
• Familiarity with the NEXIS platform
• Experience with Apache NiFi
• Experience programming in Apache Spark and PySpark
Desired:
• Familiarity with building containerized services (e.g., via Docker)
• Familiarity with the Databricks platform
• Experience developing and maintaining data processing flows
• Experience with Amazon Web Services (AWS)
• Experience with CI/CD pipelines
• Experience with Agile methodologies and the Kanban framework
• Experience using relational databases, including MySQL and/or Oracle, to design database schemas
• Experience with Linux, REST services, and HTTP
SAIC accepts applications on an ongoing basis and there is no deadline.
Covid Policy: SAIC does not require COVID-19 vaccinations or boosters.
Customer site vaccination requirements must be followed when work is performed at a customer site.
Source: SAIC