Bangalore
Software Architect I – Snowflake Architect
Role: Snowflake Architect – Data Platform Migration & Engineering
Experience Level: 12+ years (with a minimum of 3–4 years in Snowflake)

Role Overview:

UST is looking for a hands-on Snowflake Architect to lead the design, implementation, and optimization of Snowflake-based data architectures. The role is central to delivering end-to-end data platform migration projects and building modern data pipelines using tools such as DBT, SQL, and Apache Airflow. As a technical leader and client-facing consultant, you'll balance engineering excellence with strategic guidance to drive successful outcomes across cloud-based analytics platforms.

Key Responsibilities:

Lead and execute full-scale data warehouse migrations from legacy systems (e.g., Teradata, Oracle, SQL Server, Redshift) to Snowflake.

Design robust and scalable Snowflake architectures aligned with client needs, covering security, performance tuning, cost governance, and data sharing.

Build and maintain modular data models and transformations using DBT and advanced SQL techniques.
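
For illustration, a minimal sketch of the kind of dbt model this work involves: an incremental fact table built from staging models. All model and column names here are hypothetical.

    -- models/marts/fct_orders.sql (hypothetical dbt incremental model)
    {{ config(materialized='incremental', unique_key='order_id') }}

    select
        o.order_id,
        o.customer_id,
        o.order_date,
        sum(li.amount) as order_total
    from {{ ref('stg_orders') }} as o
    join {{ ref('stg_line_items') }} as li
        on o.order_id = li.order_id
    {% if is_incremental() %}
    -- incremental runs only scan orders newer than what is already loaded
    where o.order_date > (select max(order_date) from {{ this }})
    {% endif %}
    group by 1, 2, 3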

Develop and orchestrate ETL/ELT pipelines using Apache Airflow, integrating various ingestion and transformation tools.

Create reusable frameworks and automation scripts for ingestion, transformation, and monitoring.
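
As one sketch of what such a framework might standardize, a Snowflake auto-ingest pattern; stage, pipe, table, and bucket names are placeholders, and a real external stage would typically attach a storage integration for credentials.

    -- Hypothetical ingestion pattern: external stage + auto-ingest Snowpipe
    create stage if not exists raw.orders_stage
        url = 's3://example-bucket/orders/'      -- placeholder location
        file_format = (type = 'PARQUET');

    -- Snowpipe continuously loads new files as they land in the stage
    create pipe if not exists raw.orders_pipe auto_ingest = true as
        copy into raw.orders_landing
        from @raw.orders_stage
        file_format = (type = 'PARQUET')
        match_by_column_name = case_insensitive;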

Collaborate closely with clients to gather requirements, define technical roadmaps, and execute data strategy.

Drive performance tuning, data partitioning, warehouse optimization, and resource monitoring within Snowflake.
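
By way of example (warehouse and table names are placeholders; Snowflake partitions data automatically into micro-partitions, so clustering keys are the main pruning lever):

    -- Define a clustering key so queries filtering on order_date prune partitions
    alter table analytics.fct_orders cluster by (order_date);

    -- Right-size the warehouse and suspend quickly when idle to contain cost
    alter warehouse transform_wh set
        warehouse_size = 'MEDIUM'
        auto_suspend = 60
        auto_resume = true;

    -- Monitor credit consumption per warehouse over the last 7 days
    select warehouse_name, sum(credits_used) as credits
    from snowflake.account_usage.warehouse_metering_history
    where start_time > dateadd(day, -7, current_timestamp())
    group by warehouse_name
    order by credits desc;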

Champion best practices for architecture design, version control (Git), automated testing, and CI/CD.

Mentor and guide junior engineers and client-side developers, while remaining hands-on in architecture reviews and code development.

Required Skills & Experience:

10+ years of experience in data engineering or data architecture, including 3–4 years of hands-on Snowflake implementation and optimization.

Proven success in migrating data platforms to Snowflake from traditional data warehouses.

Deep expertise in DBT (Data Build Tool), including macros, reusable models, testing, and documentation.
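
For instance, the kind of reusable macro dbt projects factor out of their models (the macro and column names are hypothetical):

    -- macros/cents_to_dollars.sql (hypothetical reusable dbt macro)
    {% macro cents_to_dollars(column_name, precision=2) %}
        round({{ column_name }} / 100.0, {{ precision }})
    {% endmacro %}

    -- used inside any model, e.g.:
    -- select {{ cents_to_dollars('amount_cents') }} as amount_usd
    -- from {{ ref('stg_payments') }}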

Advanced proficiency in SQL development and query performance tuning.

Strong experience with Apache Airflow for orchestration and workflow scheduling.

Demonstrated capability in ETL/ELT design, pipeline automation, and monitoring.

Experience working within cloud ecosystems (AWS, GCP, or Azure), leveraging cloud-native tools and services.

Skilled in client-facing consulting roles: conducting workshops, technical demos, and architecture sessions.

Familiarity with version control (Git), CI/CD workflows, and infrastructure-as-code principles.

Preferred Qualifications:

Certifications in Snowflake, DBT, or related cloud technologies.

Experience with data quality tools (e.g., Great Expectations), metadata management, or data governance frameworks.
