San Francisco, CA
Software Engineer, Data Mobility
About the Team

Come help us build the world's most reliable on-demand logistics engine for delivery! We're bringing on experienced engineers to help us advance the 24x7 global infrastructure that powers DoorDash’s three-sided marketplace of consumers, merchants, and dashers.

The Data Ingestion team at DoorDash is at the forefront of managing the seamless movement of trillions of telemetry and transaction data points from diverse sources to our data lakehouse in real time. By integrating this data with our online systems, we empower multiple business lines, drive critical machine learning models, and fuel fast-paced experimentation. Our team leverages cutting-edge open-source technologies such as Apache Spark, Flink, Kafka, Airflow, Delta Lake, and Iceberg to build and maintain a scalable, high-quality data ingestion framework. As a key player on this innovative and dynamic team, you will help evolve our systems to support DoorDash’s expanding international footprint and ensure the highest standards of reliability and flexibility. This hybrid role requires you to be located in the Bay Area or Seattle.
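
As a rough sketch of what a pipeline on this kind of stack can look like (not a description of DoorDash's internal systems), the example below reads JSON events from a Kafka topic with Spark Structured Streaming and appends them to a Delta Lake table. The broker address, topic name, schema, and storage paths are illustrative placeholders.

```python
# Illustrative sketch only: a minimal Kafka -> Spark Structured Streaming -> Delta Lake
# ingestion job. Assumes the spark-sql-kafka and delta-spark packages are on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = (
    SparkSession.builder
    .appName("telemetry-ingestion-sketch")
    .getOrCreate()
)

# Placeholder event schema for the JSON payload.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("occurred_at", TimestampType()),
])

# Read raw records from a Kafka topic as an unbounded streaming DataFrame.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder brokers
    .option("subscribe", "telemetry-events")            # placeholder topic
    .load()
)

# Kafka delivers bytes; decode the value column and parse the JSON payload.
events = (
    raw.selectExpr("CAST(value AS STRING) AS json")
    .select(from_json(col("json"), event_schema).alias("event"))
    .select("event.*")
)

# Continuously append parsed events to a Delta table; the checkpoint lets the
# stream recover its progress after restarts.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/checkpoints/telemetry-events")  # placeholder path
    .outputMode("append")
    .start("/lakehouse/telemetry_events")                            # placeholder path
)

query.awaitTermination()
```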

You’re excited about this opportunity because you will…

- High Impact: Contribute to powering multiple business lines with high-quality, low-latency data directly integrated into online systems, driving billions in revenue.
- Cutting-Edge Technology: Work with advanced open-source technologies such as Apache Spark, Flink, Kafka, Airflow, Delta Lake, and Iceberg.
- Scalability: Play a crucial role in evolving our systems to accommodate a 10x increase in scale, supporting DoorDash’s expanding international footprint.
- Innovation and Excellence: Be part of a team that drives innovation and maintains high standards of reliability and flexibility in our data infrastructure.
- Cross-Functional Collaboration: Work closely with teams in Analytics, Product, and Engineering to ensure stakeholder satisfaction with the data platform’s roadmap.
- Career Growth: Join a dynamic and growing company where your contributions are recognized and valued, with excellent visibility and opportunities for professional development.

We’re excited about you because…

- B.S., M.S., or Ph.D. in Computer Science or equivalent
- 2+ years of experience with CS fundamentals and at least one of Scala, Java, or Python, OR 2+ years of production experience in at least one of those languages
- Strong understanding of SQL
- You are located in, or willing to relocate to, the Bay Area or Seattle
- Prior technical experience with Big Data solutions: you've built meaningful pieces of data infrastructure. Bonus if those used open-source big data processing frameworks such as Spark, Airflow, Kafka, Flink, Iceberg, or Delta Lake
- Experience improving the efficiency, scalability, and stability of data platforms

Notice to Applicants for Jobs Located in NYC or Remote Jobs Associated With Office in NYC Only

We use Covey as part of our hiring and/or promotional process for jobs in NYC, and certain features may qualify it as an automated employment decision tool (AEDT) in NYC. As part of the hiring and/or promotion process, we provide Covey with job requirements and candidate-submitted applications. We used Covey Scout for Inbound from August 21, 2023, through December 21, 2023, and resumed using it on June 29, 2024.

The Covey tool has been reviewed by an independent auditor. Results of the audit may be viewed here: Covey
