Cyber Data Engineer at Advanity Technologies LLC
IRVING, TX 75039
About the Job
Job Title: Cyber Data Engineer
Location: Remote (Must be based in the U.S.)
Duration: 3 months
W2, all-inclusive
Key Responsibilities:
Design and implement secure data pipelines that efficiently support cybersecurity operations.
Optimize data architectures that integrate various cybersecurity tools, platforms, and data sources to ensure robust data flows and high availability.
Collaborate with cybersecurity analysts, data scientists, and infrastructure teams to develop data solutions that facilitate effective security analysis and threat detection.
Ensure compliance with internal security policies and industry regulations regarding data handling and storage.
Implement encryption, data masking, and other security techniques to protect sensitive information.
Monitor data pipeline performance, troubleshoot issues, and continuously optimize processes for reliability, scalability, and efficiency.
Create and enforce data validation and quality control measures to maintain data integrity across all processing stages.
Maintain thorough documentation of data pipelines, architectures, and procedures. Provide regular updates and reports on data pipeline health and performance to stakeholders.
Requirements:
Proven experience in data engineering or related roles, specifically designing secure data systems within the cybersecurity field or similar industries.
Strong proficiency in programming languages such as Python, Java, or Scala.
Experience with cloud platforms (AWS, Azure, or Google Cloud) and cloud-based data services (e.g., Redshift, BigQuery, S3).
Expertise in developing and maintaining ETL processes.
Familiarity with data security protocols and techniques, including encryption, secure data transfer, and access control mechanisms.
Experience with big data technologies such as Hadoop, Spark, or Kafka is a plus.
Solid understanding of cybersecurity principles, including data protection, threat detection, incident response, and risk management.
Experience with database management systems (both SQL and NoSQL) and modern data pipeline frameworks (e.g., Apache Airflow, dbt).
Salary
$80 - $90 per hour