Line of Service
Advisory
Industry/Sector
Not Applicable
Specialism
Data, Analytics & AI
Management Level
Senior Associate
Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth.

In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: A career within PwC
Responsibilities:
Cloud DevOps Engineer: We are seeking an experienced DevOps Data Engineer to lead the design, automation, and deployment of scalable data infrastructure. This role requires expertise in Infrastructure as Code (IaC), cloud platforms, CI/CD tools, and advanced deployment strategies to ensure reliable and seamless data delivery across environments.

Key Responsibilities:
- Design, build, and maintain end-to-end data pipelines using technologies like Apache Airflow, Spark, Kafka, or AWS Glue (see the pipeline sketch after the qualifications below).
- Automate infrastructure provisioning using Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Pulumi (see the automation sketch below).
- Set up and manage CI/CD pipelines using Jenkins, GitLab CI/CD, or GitHub Actions to enable continuous testing, integration, and deployment of data workflows.
- Implement version control and code collaboration using Git-based platforms such as GitHub, GitLab, or Bitbucket.
- Lead and implement end-to-end deployment strategies, including:
  - Blue-Green Deployments for near-zero-downtime updates and safe rollback.
  - Canary Releases for progressive rollout and validation of new features (see the routing sketch below).
  - Rolling Deployments for gradual updates across nodes without downtime.
- Containerize data processing jobs using Docker and deploy them on orchestration platforms like Kubernetes, EKS, AKS, or GKE.
- Configure infrastructure monitoring, logging, and alerting with tools like Prometheus, Grafana, CloudWatch, the ELK Stack, or Datadog.
- Ensure data infrastructure meets security, compliance, and governance standards.
- Collaborate with cross-functional teams, including data scientists, data analysts, software engineers, and DevOps teams.

Required Skills:
- Proficiency in Python, Bash, or shell scripting for automation and tooling.
- Deep expertise in Terraform and other IaC practices.
- Strong understanding of DevOps principles and CI/CD pipelines.
- Practical knowledge of Blue-Green, Canary, and Rolling Deployment strategies in production environments.
- Experience with Docker and orchestration platforms like Kubernetes.
- Solid grasp of cloud services: AWS (Glue, S3, Redshift, Lambda), Azure (Data Factory, Synapse), or GCP (BigQuery, Dataflow) (see the Glue sketch below).
- Skilled in Git workflows and version control using GitLab/GitHub.
- Understanding of performance tuning, cost optimization, and data security in cloud platforms.

Preferred Qualifications:
- Cloud certifications (AWS/Azure/GCP DevOps/Architecture).
- Experience in implementing disaster recovery, backup strategies, and high-availability systems.
- Familiarity with data observability and quality tools like Monte Carlo or Great Expectations.
- Experience with Agile/DevOps delivery methodologies.
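For illustration, a minimal sketch of the kind of orchestrated pipeline this role owns, assuming Apache Airflow 2.x; the DAG name, schedule, and extract/load logic are hypothetical placeholders, not an actual PwC workflow.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Hypothetical extract step: pull raw records from a source system.
    # The return value is pushed to XCom automatically.
    return [{"id": 1, "value": 42}]


def load(ti):
    # Hypothetical load step: read the upstream result and persist it.
    records = ti.xcom_pull(task_ids="extract")
    print(f"loading {len(records)} records")


with DAG(
    dag_id="example_daily_ingest",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # 'schedule_interval' on Airflow < 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task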
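On the automation side, a small Python wrapper that drives a non-interactive Terraform provisioning run, assuming the Terraform CLI is installed; the working-directory layout and the `environment` input variable are hypothetical.

```python
import subprocess


def provision(workdir: str, env_name: str) -> None:
    """Run a non-interactive Terraform init/apply in the given directory."""
    commands = [
        ["terraform", "init", "-input=false"],
        # -auto-approve skips the interactive confirmation prompt;
        # 'environment' is a hypothetical input variable for the module.
        ["terraform", "apply", "-auto-approve", f"-var=environment={env_name}"],
    ]
    for cmd in commands:
        # check=True raises if Terraform exits non-zero, failing the run early.
        subprocess.run(cmd, cwd=workdir, check=True)


if __name__ == "__main__":
    provision("./infra", "staging")  # hypothetical path and environment name
```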
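A conceptual sketch of the traffic-splitting idea behind a canary release: real rollouts shift weights at the load balancer or service mesh, but the staged-percentage logic looks roughly like this (stage values and the health check are illustrative).

```python
import random

# Illustrative rollout stages: fraction of traffic sent to the canary.
CANARY_STAGES = [0.01, 0.05, 0.25, 0.50, 1.00]


def route_request(canary_weight: float) -> str:
    """Pick which deployment serves a request under a given canary weight."""
    return "canary" if random.random() < canary_weight else "stable"


def healthy(version: str) -> bool:
    # Placeholder health check: in practice, compare error rates and
    # latency between canary and stable before advancing a stage.
    return True


def rollout() -> None:
    for weight in CANARY_STAGES:
        print(f"shifting {weight:.0%} of traffic to the canary")
        if not healthy("canary"):
            print("regression detected: rolling back to stable")
            return
    print("canary promoted to stable")


if __name__ == "__main__":
    rollout()
```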
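And on the cloud-services side, a minimal sketch of triggering an AWS Glue job from Python with boto3; `start_job_run` is a real Glue API call, while the job name and region are hypothetical and credentials are assumed to come from the environment.

```python
import boto3

# boto3 resolves credentials from the environment or instance profile.
glue = boto3.client("glue", region_name="us-east-1")  # hypothetical region

# Kick off an existing Glue ETL job; the job name is hypothetical.
response = glue.start_job_run(JobName="example-etl-job")
print("started run:", response["JobRunId"])
```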
Mandatory skill sets:
DevOps
Preferred skill sets:
DevOps
Years of experience required:
3-8 years
Education qualification: BE/BTECH, ME/MTECH, MBA, MCA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor Degree, Master Degree
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills
DevOps
Optional Skills
Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 28 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements
Not Specified
Available for Work Visa Sponsorship?
No
Government Clearance Required?
No
Job Posting End Date