Do you want to be part of an enterprise data solutions team managing over 4 petabytes of data and building the next-generation analytics platform for a leading financial firm with over $10 trillion in assets under management? At Schwab, the Schwab Data Operational Data Exchange (ODX) organization owns the strategy, implementation, delivery, and support of the enterprise data warehouse and emerging data platforms.
We are looking for someone with a passion for data and a software engineering background specializing in data. Someone who has experience designing real-time streaming workflows and coding batch ETL (and ELT) workflows. Someone who wants to be part of the Data Exchange team that is actively designing and implementing enterprise data solutions. Someone who wants to be challenged every day and has a passion for keeping up to date on new technologies.
What you’ll do:
- Design, develop, and maintain scalable data streaming pipelines using Java, Spring, and GCP-native services such as Pub/Sub and Dataflow, or alternatives like Kafka and RabbitMQ (see the sketch after this list).
- Develop and unit test high-quality, maintainable code; partner with QA to ensure comprehensive test coverage and zero-defect production releases.
- Design, develop, and manage a front-end self-service portal using React.
- Build reliable batch ingestion jobs to integrate HR data from multiple upstream sources into the Operational Data Exchange (ODX) database.
- Streamline, simplify, and performance-tune batch and streaming data loads to improve throughput and minimize latency.
- Collaborate closely with business stakeholders and upstream application teams to understand requirements, align on data contracts, and build trusted relationships.
- Work with Production Support and Platform Engineering teams to triage and resolve production issues promptly, while ensuring data security and platform reliability.
- Follow agile and release management best practices to ensure smooth deployments and prevent production install failures.
- Stay current with evolving technologies and trends; continuously learn and apply modern patterns for data engineering and streaming.
- Communicate effectively across technical and non-technical audiences; demonstrate ownership, adaptability, and a collaborative mindset.
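Purely for illustration (not part of the job description): a minimal sketch of the kind of streaming consumer the first responsibility describes, here using Kafka, one of the technologies named above; a Pub/Sub subscriber would follow the same shape. The broker address, topic, and consumer group id are hypothetical.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class HrEventsConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumption: local broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "odx-hr-events");           // hypothetical group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("hr-events")); // hypothetical topic name
            while (true) {
                // Poll the broker for new records, blocking up to one second.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // A real pipeline would validate, transform, and load the
                    // event into the ODX database; here we just print it.
                    System.out.printf("key=%s value=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```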
What you have

What MUST you have?
- Minimum 7 years of hands-on development experience with parallel processing databases such as Teradata and Google BigQuery (see the sketch after this list).
- 5+ years of experience with Java Spring Boot, preferably with Google Cloud Platform and Informatica IICS.
- 2+ years of experience developing front-end applications using React.
- Experience with data streaming technologies such as Kafka and RabbitMQ.
- Experience with all aspects of data systems, including database design, ETL, aggregation strategy, and performance optimization.
- Experience setting best practices for building and designing code, with strong Java and SQL skills to develop, tune, and debug complex applications.
- Expertise in schema design and developing data models, with a proven ability to work with complex data.
- Hands-on experience with Java, Python, or Spark.
- Hands-on experience with Linux and shell scripting.
- Hands-on experience with CI/CD tools such as Bamboo, Jenkins, and Bitbucket.
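Again for illustration only: a minimal sketch of querying Google BigQuery from Java with the google-cloud-bigquery client, the kind of parallel-processing-database work the first requirement mentions. The project, dataset, table, and column names are hypothetical, and credentials are assumed to come from the environment's application-default credentials.

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

public class OdxQueryExample {
    public static void main(String[] args) throws InterruptedException {
        // Client built from application-default credentials.
        BigQuery bigQuery = BigQueryOptions.getDefaultInstance().getService();

        // Hypothetical table and columns: fetch rows updated in the last day.
        QueryJobConfiguration query = QueryJobConfiguration.newBuilder(
                "SELECT employee_id, dept, updated_at "
              + "FROM `my-project.odx.hr_employees` "
              + "WHERE updated_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)")
            .setUseLegacySql(false)
            .build();

        // Run the query synchronously and iterate the result rows.
        TableResult result = bigQuery.query(query);
        result.iterateAll().forEach(row ->
            System.out.printf("%s %s%n",
                row.get("employee_id").getStringValue(),
                row.get("dept").getStringValue()));
    }
}
```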
Why work for us?
Own Your Tomorrow embodies everything we do! We are committed to helping our employees ignite their potential and achieve their dreams. Our employees get to play a central role in reinventing a multi-trillion-dollar industry, creating a better, more modern way to build and manage wealth.

Benefits: We offer a competitive and flexible package designed to help you make the most of your life at work and at home, today and in the future.