Data Engineer
Ford Motor Company
As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of Current State Receivables and Originations data in our data warehouse, performing impact analysis related to Ford Credit North America's modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for Ford Credit.
Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas and who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and 3rd party technologies for deployment on Google Cloud Platform.
· Design and build production data engineering solutions on Google Cloud Platform (GCP) using services such as BigQuery, Dataflow, Dataform, Astronomer, Data Fusion, Dataproc, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Artifact Registry, GCP APIs, Cloud Build, App Engine, and real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
· Design new solutions to better serve AI/ML needs.
· Lead teams to expand our AI-enabled services.
· Partner with governance teams to tackle key business needs.
· Collaborate with stakeholders and cross-functional teams to gather and define data requirements and ensure alignment with business objectives.
· Partner with analytics teams to understand how value is created using data.
· Partner with central teams to leverage existing solutions to drive future products.
· Design and implement batch, real-time streaming, scalable, and fault-tolerant solutions for data ingestion, processing, and storage.
· Create insights into existing data to fuel the creation of new data products.
· Perform necessary data mapping, impact analysis for changes, root cause analysis, and data lineage activities, documenting information flows.
· Implement and champion an enterprise data governance model.
· Actively promote data protection, sharing, reuse, quality, and standards to ensure data integrity and confidentiality.
· Develop and maintain documentation for data engineering processes, standards, and best practices.
· Ensure knowledge transfer and ease of system maintenance.
· Utilize GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures.
· Provide production support by addressing production issues as per SLAs.
· Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.
· Work within an agile product team.
· Deliver code frequently using Test-Driven Development (TDD), continuous integration, and continuous deployment (CI/CD).
· Continuously enhance your domain knowledge.
· Stay current on the latest data engineering practices.
· Contribute to the company's technical direction while maintaining a customer-centric approach.
· GCP Certified Professional Data Engineer
· 5+ years of experience designing and implementing data warehouses and ETL processes, delivering high-quality data solutions.
· 5+ years of complex SQL development experience
· 2+ years of experience with programming languages such as Python, Java, or Apache Beam.
· Experienced cloud engineer with 3+ years of GCP expertise, specializing in managing cloud infrastructure and applications and delivering production-scale solutions.
· In-depth understanding of GCP’s underlying architecture and hands-on experience with crucial GCP services, especially those related to batch and real-time data processing, leveraging Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage services including Cloud Storage.
· Experience with DevOps tools such as Tekton, GitHub, Terraform, and Docker.
· Expert in designing, optimizing, and troubleshooting complex data pipelines.
· Experience developing and deploying microservices architectures leveraging container orchestration frameworks.
· Experience in designing pipelines and architectures for data processing.
· Passion and self-motivation to develop/experiment/implement state-of-the-art data engineering methods/techniques.
· Self-directed, able to work independently with minimal supervision, and adaptable to ambiguous environments.
· Evidence of a proactive problem-solving mindset and willingness to take the initiative.
· Strong prioritization, collaboration & coordination skills, and ability to simplify and communicate complex ideas with cross-functional teams and all levels of management.
· Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity.
· Master’s degree in Computer Science, Software Engineering, Information Systems, Data Engineering, or a related field.
· Data engineering or development experience gained in a regulated financial environment.
· Experience coaching and mentoring Data Engineers.
· Experience with project management tools such as Atlassian Jira.
· Experience working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
· Experience with data security, governance, and compliance best practices in the cloud.
· Experience using data science concepts on production datasets to generate insights.
**Requisition ID**: 48745