Data Engineer – GCP & BigQuery
Roles and Responsibilities:
We are seeking talented Data Engineers with a strong background in data migration, integration, and pipeline development to join our team. You will play a key role in designing, building, and optimizing data flows and infrastructure using Google Cloud Platform (GCP) and BigQuery.
Key Responsibilities:
Migrate data from on-premises databases (especially SQL Server) to Google BigQuery
Design, develop, and manage ETL/ELT pipelines using Apache Airflow, Python, and Spark (an illustrative pipeline sketch follows this list)
Refactor and convert legacy SSIS packages for GCP compatibility and performance
Integrate and normalize data from various structured and semi-structured sources including APIs and external databases
Write, optimize, and manage complex SQL queries, views, and stored procedures in BigQuery
Collaborate with cross-functional teams including Data Analysts, Architects, and DevOps to deliver robust data solutions
Ensure data quality, governance, and security compliance across the pipeline
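To give candidates a concrete picture of the day-to-day work, below is a minimal, illustrative sketch of the kind of pipeline described above: an Airflow DAG that stages a SQL Server table in Cloud Storage and loads it into BigQuery. It assumes Airflow 2.x with the Google provider installed; the DAG name, connection IDs, bucket, dataset, and table names are placeholders rather than references to any real environment.

    # Illustrative only: extract a SQL Server table to GCS, then load it into BigQuery.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.mssql_to_gcs import MSSQLToGCSOperator
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

    with DAG(
        dag_id="sqlserver_orders_to_bigquery",   # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Extract the source table from SQL Server and stage it as newline-delimited JSON in GCS.
        extract = MSSQLToGCSOperator(
            task_id="extract_orders",
            mssql_conn_id="mssql_onprem",        # placeholder connection ID
            sql="SELECT * FROM dbo.Orders",
            bucket="example-staging-bucket",     # placeholder bucket
            filename="orders/{{ ds }}/part-{}.json",
            export_format="json",
        )

        # Load the staged files into BigQuery, replacing the table contents on each run.
        load = GCSToBigQueryOperator(
            task_id="load_orders",
            bucket="example-staging-bucket",
            source_objects=["orders/{{ ds }}/*.json"],
            destination_project_dataset_table="example_project.sales.orders",
            source_format="NEWLINE_DELIMITED_JSON",
            write_disposition="WRITE_TRUNCATE",
            autodetect=True,
        )

        extract >> load

In practice the same extract-stage-load shape extends to incremental loads, partition-scoped writes, and Spark-based transforms.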
Must-Have Skills:
Strong hands-on experience in Python, SQL, and Google Cloud Platform (GCP)
Proficiency in Google BigQuery – data modeling, query optimization, and cost-efficient design (a brief table-design sketch follows this list)
Experience with data migration from SQL Server/on-prem to cloud-based solutions
Knowledge of ETL tools such as Apache Airflow and SSIS
Familiarity with Apache Spark for data transformation and processing
Practical experience in data warehousing concepts and implementations
Strong problem-solving and performance-tuning skills
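For the BigQuery design point above, the following sketch (again illustrative, using the google-cloud-bigquery Python client; the project, dataset, and column names are made up) shows the kind of cost-conscious table layout the role involves: a date-partitioned table clustered on a commonly filtered column, so typical time-bounded queries scan, and are billed for, only a fraction of the data.

    # Illustrative only: create a partitioned, clustered BigQuery table.
    from google.cloud import bigquery

    client = bigquery.Client(project="example_project")    # placeholder project

    table = bigquery.Table(
        "example_project.sales.orders",                     # placeholder table
        schema=[
            bigquery.SchemaField("order_id", "INT64"),
            bigquery.SchemaField("customer_id", "INT64"),
            bigquery.SchemaField("order_date", "DATE"),
            bigquery.SchemaField("amount", "NUMERIC"),
        ],
    )
    # Partition on the order_date column and cluster by a frequent filter column
    # so queries that filter on order_date/customer_id prune most of the table.
    table.time_partitioning = bigquery.TimePartitioning(field="order_date")
    table.clustering_fields = ["customer_id"]

    client.create_table(table, exists_ok=True)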