Required Qualifications/Experience
• Bachelor’s degree in Computer Science, Engineering, or a related field.
• 5+ years of experience in data engineering or a similar role.
• Strong proficiency in SQL for data manipulation and transformation, including experience with relational databases such as MySQL and PostgreSQL as well as NoSQL databases.
• Big data and data modelling: experience with Snowflake or a similar cloud data platform, plus hands-on experience in data modelling, ETL processes, data pipeline development, performance tuning, and data warehousing.
• Proven experience in building dashboards with Power BI and Tableau:
o Power BI - Data Modelling, DAX, Power Query, Report & Dashboard Design, Data Connectivity, Performance Optimization, Row-Level Security
o Tableau - Data Preparation, Calculated Fields, Visual Analytics, Data Blending & Joins, Parameters & Actions, Publishing & Sharing
• Advanced SQL, including CTEs, functions, stored procedures, pivoting, window functions, and performance tuning (see the illustrative sketch after this list).
• Experience with the Snowflake environment, including tasks, stored procedures, streams, Snowpark, Streamlit, AI/ML Studio, etc.
• Excellent problem-solving skills and attention to detail.
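
For context only, the sketch below illustrates the kind of advanced SQL this role involves (a CTE combined with a window function), run from Python with the snowflake-connector-python package. The orders table, column names, and connection parameters are placeholders and assumptions, not part of these requirements.

# Illustrative sketch only: a hypothetical CTE + window-function query executed
# against Snowflake via snowflake-connector-python. All names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",   # placeholder credentials/connection details
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)

query = """
WITH daily_sales AS (                        -- CTE
    SELECT order_date, region, SUM(amount) AS total_amount
    FROM orders                              -- hypothetical table
    GROUP BY order_date, region
)
SELECT
    order_date,
    region,
    total_amount,
    RANK() OVER (PARTITION BY region
                 ORDER BY total_amount DESC) AS sales_rank   -- window function
FROM daily_sales
"""

try:
    cur = conn.cursor()
    cur.execute(query)
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
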
________________________________________
Preferred (but not required) Experience
• Experience with cloud platforms like AWS:
o Message brokers such as Kafka, AWS SQS, AWS SNS, Kinesis
o AWS services such as Lambda functions, SQS, and Step Functions
• Knowledge of scripting languages such as Python, including data and workflow libraries such as Pandas, NumPy, PySpark, Matplotlib, and Airflow (see the Python sketch after this list).
• Understanding of CI/CD pipelines and version control systems: Git repositories, code versioning, and building and integrating CI/CD pipelines.
• Container platforms and orchestration tools such as Docker, Amazon ECR, and Kubernetes.
• Exposure to Agile methodologies and tools like Azure DevOps.
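
For context only, the sketch below illustrates the kind of lightweight Python data work referenced above, using Pandas to load, clean, and aggregate a hypothetical CSV extract. The file name and column names are assumptions, not part of these requirements.

# Illustrative sketch only: a small Pandas transform over a hypothetical extract.
import pandas as pd

# Load a hypothetical daily extract (placeholder file and columns).
orders = pd.read_csv("orders_extract.csv", parse_dates=["order_date"])

# Basic cleaning: drop rows missing an amount and normalise region labels.
orders = orders.dropna(subset=["amount"])
orders["region"] = orders["region"].str.strip().str.upper()

# Aggregate to monthly revenue per region.
monthly = (
    orders
    .assign(month=orders["order_date"].dt.to_period("M"))
    .groupby(["month", "region"], as_index=False)["amount"]
    .sum()
    .rename(columns={"amount": "monthly_revenue"})
)

print(monthly.head())
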