Role Proficiency:
Design, develop, and implement data architecture for a large enterprise in two of the following areas: data ingestion, data management, data delivery, and data consumption.
Outcomes:
• Architect, design, and implement high-performance, large-volume data integration and transformation, data warehouses, data lakes, and large-volume warehouse data in the cloud (a minimal pipeline sketch follows this list)
• Perform data analysis, reporting, and visualization of data from the data warehouse using any one tool such as Power BI, Tableau, or D3
• Design and implement a cloud big data solution on one of AWS, Azure, GCP, or Apache Hadoop
• Plan information architecture by studying the site concept, strategy, domain, and target audience; design the information structure and features; evaluate information representation; specify backup, recovery, and security
• Customer focus: improve customer service, engage customers effectively, and build with the customer on your solution capability
• Provide end-to-end technical guidance and roadmaps for Sales & Marketing
• Validate estimates from the team
• Hold an Architect or Big Data specialty certification in one of AWS, Azure, or GCP, with experience in and knowledge of architecting big data solutions on other cloud and big data platforms
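The integration and transformation outcome above is easiest to picture in code. The following is a minimal, hypothetical PySpark sketch of such a pipeline; the lake paths, column names, and aggregation logic are illustrative assumptions, not part of the role definition.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Hypothetical raw zone of a data lake (path and schema are assumptions)
raw = spark.read.json("s3://example-lake/raw/orders/")

# Transform: normalize timestamps, deduplicate, aggregate to daily grain
daily = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .dropDuplicates(["order_id"])
       .groupBy(F.to_date("order_ts").alias("order_date"))
       .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue"))
)

# Land curated, warehouse-friendly output in a columnar format
daily.write.mode("overwrite").parquet("s3://example-lake/curated/daily_orders/")
```

A curated table like this hypothetical daily_orders is then a natural source for the reporting and visualization work (Power BI, Tableau, D3) described above.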
Measures of Outcomes:
• Percent of time spent in a year developing a data architecture or strategy for the customer
• Percent of time spent solving customer problems in the data area
• Percent of time spent providing data strategy for the customer
• Dollar value of RFPs won for which the solutioning was proposed
• Dollar value saved for the customer through cost reduction, or dollar value created for the customer by prediction models built by you or your team
Outputs Expected:
Strategy & Planning:
• Develop and deliver long-term strategic goals for data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders
• Create short-term tactical solutions to achieve long-term objectives, and an overall data management roadmap
• Establish processes for governing the identification, collection, and use of corporate data assets and metadata; take steps to assure metadata accuracy and validity
• Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement
• Ensure that data strategies and architectures meet regulatory compliance requirements
• Engage external stakeholders, including standards organizations, regulatory bodies, operators, and scientific research communities
• Ensure business needs in operational and reporting data are met
Operational Management:
• …stewardship and frameworks for managing data across the organization
• Develop and promote data management methodologies and standards
• Select and implement the appropriate tools, software, applications, and systems to support data technology goals
• Collaborate with project managers and business teams on all projects involving enterprise data
• Address data-related issues with systems integration, compatibility, and multi-platform integration
• Act as a leader and advocate of data management, including coaching, training, and …
• Develop and implement key components as needed to create testing criteria in order to guarantee the fidelity and performance of the data architecture
• Document the data architecture and environment in order to maintain a current and accurate view of the larger data picture
• Identify and develop opportunities for data reuse, migration, or retirement
Project Control and Review:
• …time and asset utilization in complex projects, and advise relevant teams accordingly where possible
• Provide advice to teams facing complex technical issues in the course of project delivery
• Conduct planned and unplanned technical audits for complex projects, as applicable
• Define and measure project- and program-specific architectural and technology quality metrics
Knowledge Management & Capability Development:
• Identify training needs and conduct internal sessions to meet them
• Partner with UST Gamma to create curriculum, assessments, training programs, courseware, etc., based on new service offerings and solutions
• Update collateral in the knowledge management repository
• Gain and cultivate domain expertise to provide the best and most optimized solutions to customers
Requirement Gathering and Analysis:
• …by working with customers and other stakeholders
People Management:
Alliance Management:
Technology Consulting:
• …process, tools, and arrive at the solution options that best fit the client
• Analyse cost vs. benefits of solution options
• Define the technology/architecture roadmap for the client
• Articulate the cost vs. benefits of options to key stakeholders at the customer
Innovation and Thought Leadership:
• …paper presentations, etc.
• Interact and engage with customers/partners around new and innovative ideas, concepts, and assets, as well as industry trends and their implications
• Participate in beta testing of products and joint lab setups with customers to explore the application of new technologies/products
• Identify areas where components, accelerators, or design patterns could be reused across different accounts
• Create documents, reports, and white papers (international/national) on research findings
Sales Support and Project Estimation:
Data Design & Definition:
• …requirements compliance of the solution, especially in the case of a multi-group/multi-vendor program
• Involve external partners/horizontals as appropriate
• Provide guidance to the team on the usage of the data architecture, drawing on experience in solution implementation, product selection and definition, configuration, migration, and transformations between cloud and hosted solutions
• Take responsibility for developing cutting-edge wallet applications
• Ensure UST architecture principles and QOA (Quality of Architecture) are maintained
• Analyse trade-offs and provide recommendations, if any, on the data architecture
• Identify opportunities for efficiency improvements
• Understand the various work streams and the technologies necessary to deliver on large programs
• Resolve issues by leveraging industry expertise and partners
• Provide best-in-class technology and solution options to customers, with detailed examples and case studies
• Guide customers on process and technology improvements to achieve agility and quick results
Project Management Support:
Stakeholder Management:
New Service Design:
• …guides for GTM
Skill Examples:
• Use knowledge of technology trends to provide input on potential areas of opportunity for UST
• Use knowledge of architecture concepts and principles to create architectures catering to functional and non-functional requirements, under the guidance of a specialist
• Re-engineer existing architecture solutions under the guidance of a specialist
• Provide training on architecture best practices, under guidance
• Use independent knowledge of design patterns, tools, and principles to create high-level designs for the given requirements
• Evaluate multiple design options and choose the one offering the best possible trade-offs
• Conduct knowledge sessions to enhance the team's design capabilities
• Review the low- and high-level designs created by specialists for efficiency (hardware and memory consumption, memory leaks, etc.) and maintainability
• Use knowledge of software development processes, tools, and techniques to identify and assess incremental improvements to the software development process, methodology, and tools
• Take technical responsibility for all stages of the software development process
• Write optimal code with a clear understanding of memory leakage and its related impact
• Implement global standards and guidelines relevant to programming and development; come up with points of view and new technological ideas
• Use knowledge of project management and Agile tools and techniques to support, plan, and manage medium-sized projects/programs as defined within UST; identify risks and mitigation strategies
• Use knowledge of project metrics to understand their relevance to the project; collect and collate project metrics and share them with the relevant stakeholders
• Use knowledge of estimation and resource planning to create estimates and plan resources for specific modules or small projects with detailed requirements or user stories in place
• Use knowledge management tools and techniques to leverage existing material and reusable assets in the knowledge repository; independently create and update knowledge artefacts; create and track project-specific KT plans; provide training to others; write white papers and blogs at the internal level; write technical and user-understanding documents at the end of the project
• Use knowledge of technical standards, documentation, and templates to create documentation appropriate to the project's needs, as well as for reusable assets, best practices, and case studies
• Use knowledge of requirement gathering and analysis to support the creation of requirements documents or user stories and high-level process maps; identify gaps on the basis of business processes; analyse responses to clarification questions and produce design documents
• Create and review estimates and solutions at the project/program level
• Create and review design artefacts; update resourcing and schedules based on the impacted areas identified; create designs specifically for non-functional requirements
• Strong proficiency in understanding data workflows and dataflows
• Attention to detail
• High analytical capability
• Creativity in developing solutions
• Self-learning
• Resilience in solving problems
Knowledge Examples:
• Domain knowledge and understanding of domain data
• Applied mathematics and statistics
• Skilled in writing design documents; use of PowerPoint and Visio
• Data visualization, streaming, and data migration
• Data modelling, including vectors and graphs
• RDBMSs (relational database management systems) and SQL
• NoSQL databases and cloud computing
• Hadoop technologies such as MapReduce, Hive, and Pig
• Programming languages, especially Python and Java
• Operating systems, including UNIX and MS Windows
• Cloud solutions
• Backup/archival software
• Understanding of data security
Additional Comments:
Job Title: Lead Data Quality Engineer (with Cloud-Agnostic Data Engineering Expertise)

Summary
We are seeking a highly skilled Lead Data Quality Engineer with a strong foundation in data engineering to lead our data quality and reliability initiatives across modern, cloud-agnostic data platforms. You will drive the strategy, design, and execution of robust data quality frameworks and performance validation processes for large-scale, distributed data systems. The ideal candidate is a hands-on technologist with a passion for clean, accurate, and high-performing data pipelines, and with the ability to work across cloud platforms, teams, and tools to ensure data trustworthiness across the enterprise.

Key Responsibilities

Data Quality & Performance Engineering (Primary Focus)
• Define, implement, and continuously improve data quality frameworks, validation rules, and monitoring systems across data pipelines and APIs (a minimal validation sketch follows this list).
• Lead performance and reliability testing for data lakes, data warehouses, and streaming systems (see the load-test sketch at the end of this description).
• Analyze pipeline and API performance to detect and resolve bottlenecks related to latency, throughput, and reliability.
• Integrate and manage tools such as Great Expectations, dbt, JMeter, Locust, Grafana, and Prometheus.
• Lead a team of SDETs focused on test automation and data platform QA across environments.
• Collaborate with business stakeholders to define data quality KPIs and communicate insights from quality monitoring and performance analysis.
• Work closely with Data Engineers and Platform Engineers to embed quality controls into CI/CD pipelines and infrastructure-as-code deployments.
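As a rough sketch of what a validation rule might look like with Great Expectations (one of the tools named above): the dataset, columns, and thresholds here are hypothetical, and the exact API varies by Great Expectations version (this uses the classic pandas-backed interface, pre-1.0).

```python
import great_expectations as ge
import pandas as pd

# Hypothetical batch of pipeline output to validate
df = ge.from_pandas(pd.read_csv("daily_orders.csv"))

# Declarative expectations: keys present and unique, values in a sane range
df.expect_column_values_to_not_be_null("order_date")
df.expect_column_values_to_be_unique("order_date")
df.expect_column_values_to_be_between("revenue", min_value=0, max_value=1_000_000)

# Run all registered expectations and gate the pipeline on the result
results = df.validate()
if not results.success:
    raise ValueError(f"Data quality check failed: {results}")
```

Checks like these can run inside CI/CD so that a bad batch fails the pipeline before publication, which is what "embed quality controls into CI/CD pipelines" above amounts to in practice.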
Data Engineering & Platform Support (Secondary Focus)
• Contribute to the design and development of data ingestion, transformation, and storage processes.
• Support scalable ETL/ELT pipelines using best-in-class tools and frameworks.
• Ensure data governance, lineage, and compliance across distributed cloud environments.
• Collaborate with BI and analytics teams to ensure data availability for dashboards, APIs, and machine learning models.
• Contribute to infrastructure automation using tools like Terraform, CloudFormation, or other IaC tools.

Required Qualifications
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• 12+ years in commercial software engineering.
• 5+ years of experience in data engineering with a focus on data quality, and 2+ years in a lead or senior quality engineering role.
• Strong hands-on programming skills in Java, Python, and SQL.
• Expertise in test automation tools (e.g., Selenium, REST Assured, Postman, Cypress).
• Experience working with relational, NoSQL, and document-oriented databases (e.g., PostgreSQL, MongoDB, Elasticsearch).
• Familiarity with big data and distributed processing tools such as Apache Spark, Kafka, Hadoop, etc.
• Proven experience in performance engineering of data systems (e.g., tuning for scale and latency).
• Hands-on experience with at least one major cloud platform (AWS, Azure, GCP); familiarity with others preferred.
• Strong communication and stakeholder engagement skills.

Preferred Qualifications
• Certifications in AWS, Azure, or GCP related to Data Engineering or DevOps.
• Familiarity with:
  o Data governance and cataloging tools: Collibra, Apache Atlas, Alation.
  o Lakehouse systems: Delta Lake, Apache Iceberg, Hudi.
  o MLOps or DataOps pipelines and platforms (e.g., MLflow, Kubeflow, TensorFlow).
• Experience in regulated or domain-specific environments such as the hospitality domain.
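For the performance and latency responsibilities above, a minimal load-test sketch with Locust (also named in the tooling list) might look as follows; the endpoints, payload, and pacing are hypothetical.

```python
from locust import HttpUser, task, between

class DataApiUser(HttpUser):
    # Simulated think time between requests
    wait_time = between(0.5, 2)

    @task(3)
    def query_daily_orders(self):
        # Hypothetical read-heavy endpoint of the data API
        self.client.get("/api/v1/daily-orders?limit=100")

    @task(1)
    def ingest_order(self):
        # Hypothetical write path, exercised less frequently
        self.client.post("/api/v1/orders", json={"order_id": "demo-1", "amount": 42.0})
```

Run with, for example, `locust -f loadtest.py --host https://data-api.example.com` (a hypothetical host); the latency and throughput percentiles from such runs feed the bottleneck analysis and the Grafana/Prometheus dashboards mentioned above.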