Experience Range: 3 to 7 years
Hiring Locations: Chennai, Trivandrum, Kochi
The role demands proficiency in designing and developing robust data pipelines using ETL tools and programming languages such as Python, PySpark, and SQL. The candidate will be responsible for coding, testing, and implementing data ingestion and transformation pipelines. Additionally, the role includes L1 Data Operations responsibilities, such as monitoring dashboards, identifying anomalies, and escalating issues as per standard operating procedures (SOPs).
Key Responsibilities

Data Pipeline Development:
Independently develop, test, and implement data processing pipelines.
Use tools such as Informatica, AWS Glue, Databricks, and Dataproc.
Write clean, scalable, and optimized code in Python, PySpark, and SQL (a minimal PySpark sketch follows this list).
Conduct thorough unit testing to ensure data accuracy and pipeline stability.
Create clear documentation and maintain project artifacts.
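
To give a flavour of the pipeline work described above, here is a minimal PySpark sketch. It is an illustration only: every path, table, and column name in it (s3://example-bucket, order_id, amount, and so on) is a hypothetical placeholder, not a detail of any actual project.

    # Minimal PySpark pipeline sketch. All paths and column names are
    # illustrative assumptions, not details of a specific project.
    from pyspark.sql import SparkSession, DataFrame
    from pyspark.sql import functions as F

    def clean_orders(df: DataFrame) -> DataFrame:
        # Pure function: easy to unit test against a small in-memory DataFrame.
        return (
            df.dropDuplicates(["order_id"])
              .withColumn("order_ts", F.to_timestamp("order_ts"))
              .filter(F.col("amount") > 0)
        )

    if __name__ == "__main__":
        spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()
        raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")
        clean_orders(raw).write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")
        spark.stop()

Keeping the transformation as a pure function, as above, is what makes the unit-testing responsibility practical: the function can be exercised against a small in-memory DataFrame in a test suite before the pipeline is deployed.
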
Data Operations (L1 Monitoring):
Monitor data pipelines, dashboards, and databases on a shift basis (24x7 support, including night shifts).
Identify, log, and escalate anomalies or failures using SOPs and runbooks.
Execute basic SQL queries for data validation and issue resolution (an example validation check follows this list).
Collaborate with L2/L3 teams for escalation and root cause analysis.
Maintain logs of incidents and escalations.
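
A typical L1 validation of the kind mentioned above is a basic SQL check for missing keys or stale loads. The sketch below assumes a hypothetical curated.orders table and runs the check through spark.sql:

    # Basic data-validation check of the kind run during L1 monitoring.
    # curated.orders and its columns are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("l1_validation").getOrCreate()

    row = spark.sql("""
        SELECT
            COUNT(*) AS total_rows,
            SUM(CASE WHEN order_id IS NULL THEN 1 ELSE 0 END) AS null_keys,
            MAX(load_date) AS latest_load
        FROM curated.orders
    """).collect()[0]

    # At L1, anomalies are logged and escalated per SOP, not fixed in place.
    if row["null_keys"] and row["null_keys"] > 0:
        print(f"ANOMALY: {row['null_keys']} rows with NULL order_id - escalate per runbook")

Any breach found this way would be logged and escalated to the L2/L3 teams per the runbook, in line with the escalation responsibilities above.
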
Additional Responsibilities:
Adhere to project timelines, SLAs, and compliance standards.
Participate in estimating effort and timelines for assigned work.
Obtain foundational certifications in cloud platforms (Azure, AWS, or GCP).
Contribute to knowledge management, documentation repositories, and release management processes.
Mandatory Skills

Proficiency in Python, PySpark, and SQL
Experience with ETL tools such as Informatica, AWS Glue, Databricks, or Dataproc
Strong understanding of data pipeline design and data wrangling
Hands-on experience in cloud platforms – AWS, Azure, or GCP (especially with data services)
Knowledge of data schemas, transformations, and models (a schema sketch follows this list)
Strong ability to debug and test data processes and troubleshoot issues
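
As a concrete, purely illustrative example of working with data schemas and transformations, the sketch below declares an explicit schema so that ingestion fails fast on type drift instead of relying on inference. The fields shown are assumptions, not a real data model:

    # Explicit schema: ingestion fails fast on type drift rather than
    # silently inferring wrong types. All fields are illustrative only.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import (DoubleType, StringType, StructField,
                                   StructType, TimestampType)

    order_schema = StructType([
        StructField("order_id", StringType(), nullable=False),
        StructField("amount", DoubleType(), nullable=True),
        StructField("order_ts", TimestampType(), nullable=True),
    ])

    spark = SparkSession.builder.appName("schema_demo").getOrCreate()
    df = spark.read.schema(order_schema).json("s3://example-bucket/raw/orders/")

    # Typical wrangling: fill defaults and derive a model-friendly column.
    modeled = df.fillna({"amount": 0.0}).withColumn("order_date", F.to_date("order_ts"))
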
Good to Have Skills

Familiarity with Apache Airflow, Talend, Azure Data Factory (ADF), or GCP Dataflow
Certification in Azure/AWS/GCP data services
Experience in production monitoring, L1 data ops support, and incident escalation
Exposure to window functions in SQL and advanced Excel analysis (a window-function example follows this list)
Knowledge of Agile/Scrum development processes
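
For the window-function exposure noted above, a common pattern is deduplicating to the latest record per key. The SQL itself is standard; the curated.orders table and its columns are again hypothetical:

    # SQL window function example: keep the most recent record per order_id.
    # curated.orders and its columns are assumed for illustration.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("window_demo").getOrCreate()

    latest = spark.sql("""
        SELECT *
        FROM (
            SELECT o.*,
                   ROW_NUMBER() OVER (PARTITION BY order_id
                                      ORDER BY order_ts DESC) AS rn
            FROM curated.orders o
        ) ranked
        WHERE rn = 1
    """)
    latest.show()
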
Soft Skills

Strong written and verbal communication skills
Excellent analytical and problem-solving ability
Ability to work independently with minimal supervision
Keen attention to detail and precision in monitoring tasks
Collaboration and coordination skills for working with cross-functional support teams
Ability to multitask and remain calm in high-pressure, fast-paced environments
Willingness to work in 24x7 shift schedules, including night shifts as required