NSO DevOps Engineer
Airtel DTH
Key Deliverables
• Automation is at the core of this practice.
• Own Continuous Integration (CI), Continuous Testing, and Continuous Delivery.
• Deploy, manage, and maintain the big-data pipeline built on technologies such as Kafka, Elasticsearch, Druid, Envoy, Zeppelin, Kubernetes, and Docker containers.
• Keep the pipeline and infrastructure up 24x7.
• Monitor the pipeline and take proactive measures to keep it up and healthy.
• Quickly debug and resolve issues.
• Develop automation scripts for repetitive tasks such as rebalancing Kafka partitions, freeing disk space, and automated code deployment.
• Write scripts to automate tests and validate the data pipeline.
• Automate and streamline the software development and infrastructure management processes.
• Application performance monitoring and logging.
• Communication and collaboration.
• Understand the organisation's needs and apply solutions, tools, and standard methodologies, from immediate needs through to a long-term vision.
• Guide team development efforts towards successful project delivery. (E4 and above)
• Provide technical leadership to teammates through coaching and mentorship. (E4 and above)
Skills Required
• Experience operating large-scale infrastructure: Kafka, Druid, Elasticsearch
• Knowledge of application monitoring tools to troubleshoot and diagnose environment issues
• Experience with Jenkins and CI/CD
• Proficiency in Shell, Python, and Go programming
• Solid understanding of operating systems and networking principles
• Good understanding of Linux
• Plus: familiarity with automation tools such as Helm charts, Ansible, etc.
• Plus: experience with microservices and containerization technologies (Docker, Kubernetes, etc.)
• Plus: experience with modern web architectures and cloud platforms (AWS, GCP, Azure, etc.)
Educational Qualifications
Bachelor's degree in Computer Science or Computer Engineering, or comparable experience
Work Experience
2+ years of hands-on experience in software development. (Edited based on level of hire.)