Senior Cybersecurity Engineer - PKI (Remote)
Home Depot
Position Purpose:
This position is part of the PKI team in Cybersecurity. The team supports applications and devices across the entire Home Depot enterprise. This role works on platforms that issue and manage certificates and that gather certificate-related data.
This role has a development and application support focus; experience in application and system development is required.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with scripting languages (e.g., Bash, PowerShell) for automation tasks.
- Proficiency with Google Cloud.
- Knowledge of data extraction and ingestion processes to support the PKI certificate data lake.
- Proficiency with Cloud Functions / Cloud Run, with proven experience using workload identity federation from GitHub.
- Experience designing and implementing exception capture, retry logic, and the dead-letter concept to retrieve failed jobs for asynchronous execution.
- Experience with Kubernetes and Terraform is a bonus.
- Familiarity with PKI processes and solutions is preferred, including CA and CLM solutions (DigiCert, Venafi, EJBCA, etc.).
1. Cloud Computing Skills
- Cloud Platforms: Proficiency in at least Google Cloud.
- Cloud Services: Experience with cloud storage solutions (e.g., Amazon S3, Azure Blob Storage, Google Cloud Storage) and compute services (e.g., EC2, Azure VMs, Google Compute Engine).
2. Data Engineering Skills
- ETL/ELT Processes: Knowledge of data extraction, transformation, and loading (ETL/ELT) processes and tools (e.g., Apache NiFi, AWS Glue, Azure Data Factory).
- Data Ingestion: Experience with real-time data ingestion tools (e.g., Apache Kafka, AWS Kinesis, Azure Event Hubs).
- Data Modeling: Ability to design and implement data models and schemas for efficient data storage and retrieval.
3. Programming and Scripting Skills
- Programming Languages: Proficiency in programming languages such as Python, Java, or Scala.
- Scripting: Experience with scripting languages (e.g., Bash, PowerShell) for automation tasks.
4. Big Data Technologies
- Data Processing Frameworks: Experience with big data processing frameworks like Apache Spark, Hadoop, or Flink.
- Query Engines: Knowledge of query engines like Presto, AWS Athena, Azure Synapse, or Google BigQuery.
5. Database Management
- SQL and NoSQL: Proficiency in SQL for querying relational databases and familiarity with NoSQL databases (e.g., MongoDB, Cassandra).
- Data Warehousing: Experience with data warehousing solutions (e.g., Amazon Redshift, Azure Synapse, Google BigQuery).
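The exception-capture / retry / dead-letter requirement above can be sketched in plain Python. In Google Cloud this pattern is usually realized with a Pub/Sub dead-letter topic that receives messages after delivery retries are exhausted; the function and names below are illustrative only and are not taken from the posting.

```python
import time

def run_with_retry(job, payload, dead_letter, max_retries=3, delay=0.0):
    """Run job(payload), retrying on failure.

    Exceptions are captured on each attempt; once retries are exhausted,
    the failed payload is parked on `dead_letter` (a list standing in for
    a dead-letter queue/topic) so it can be retrieved and reprocessed later.
    """
    for attempt in range(1, max_retries + 1):
        try:
            return job(payload)
        except Exception as exc:  # capture the failure for retry / dead-letter handling
            if attempt == max_retries:
                # Retries exhausted: record the payload and error for later reprocessing.
                dead_letter.append({"payload": payload, "error": str(exc)})
                return None
            time.sleep(delay)  # back off before the next attempt

# Jobs that eventually succeed return normally; jobs that keep failing
# end up on the dead-letter list instead of being silently dropped.
```

In a real deployment the retry count, backoff, and dead-letter destination would come from the Pub/Sub subscription configuration rather than function arguments.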
Key Responsibilities:
Direct Manager/Direct Reports:
Travel Requirements:
Physical Requirements:
Working Conditions:
Minimum Qualifications:
Minimum Education:
Minimum Years of Work Experience:
Competencies: