Role Proficiency:
This role requires proficiency in data pipeline development, including coding and testing pipelines that ingest, wrangle, transform, and join data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.
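As a minimal sketch of the ingest, join, and transform pattern named above — using plain Python dicts in place of a real engine such as PySpark; the table names and fields are illustrative, not from any specific system:

```python
# Two "ingested" sources, represented as lists of records.
orders = [
    {"order_id": 1, "customer_id": 10, "amount": 120.0},
    {"order_id": 2, "customer_id": 11, "amount": 75.5},
]
customers = [
    {"customer_id": 10, "region": "EU"},
    {"customer_id": 11, "region": "US"},
]

# Join on customer_id (a simple hash join), then transform:
# add a hypothetical VAT-inclusive amount as a derived column.
by_id = {c["customer_id"]: c for c in customers}
joined = [
    {**o, **by_id[o["customer_id"]], "amount_gross": round(o["amount"] * 1.2, 2)}
    for o in orders
    if o["customer_id"] in by_id
]
```

In an actual pipeline the same three steps would typically be expressed as DataFrame reads, a `join`, and a `withColumn`-style projection, with the engine handling distribution and spill.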
Outcomes:
• Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and the reuse of proven solutions.
• Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
• Document and communicate milestones/stages for end-to-end delivery.
• Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
• Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
• Validate results with user representatives, integrating the overall solution seamlessly.
• Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
• Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
• Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes:
• Adherence to engineering processes and standards
• Adherence to schedule/timelines
• Adherence to SLAs where applicable
• Number of defects post-delivery
• Number of non-compliance issues
• Reduction in recurrence of known defects
• Quick turnaround of production bugs
• Completion of applicable technical/domain certifications
• Completion of all mandatory training requirements
• Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
• Average time to detect, respond to, and resolve pipeline failures or data issues
• Number of data security incidents or compliance breaches

Outputs Expected:
Code Development:
Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
Documentation:
Create checklists, guidelines, and standards for design, processes, and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
Configuration:
Testing:
Create test scenarios and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
Domain Relevance:
Develop solutions that demonstrate a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
Project Management:
Defect Management:
Estimation:
Knowledge Management:
Contribute to and consume knowledge assets such as SharePoint, libraries, and client universities. Review reusable documents created by the team.
Release Management:
Design Contribution:
Contribute to low-level design (LLD) and system architecture for applications, business components, and data models.
Customer Interface:
Team Management:
Certifications:
Skill Examples:
Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Ability to conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning of data processes. Expertise in designing and optimizing data warehouses for cost efficiency. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Capacity to clearly explain and communicate design and development aspects to customers. Ability to estimate time and resource requirements for developing and debugging features or components.
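The SQL-for-analytics skill above typically includes windowing functions. A hedged example, run via Python's built-in sqlite3 module (SQLite 3.25+ supports window functions); the `sales` table and its values are invented for illustration:

```python
import sqlite3

# In-memory database with a tiny illustrative table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EU", 100.0), ("EU", 50.0), ("US", 80.0)],
)

# Rank rows within each region by amount — a typical analytics windowing query.
rows = conn.execute(
    """
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
    """
).fetchall()
```

The same `RANK() OVER (PARTITION BY ... ORDER BY ...)` form carries over to warehouse engines such as BigQuery and Snowflake.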
Knowledge Examples:
Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models relevant to various business contexts. Familiarity with domain-related data and its implications. Expertise in data warehousing optimization techniques. Knowledge of data security concepts and best practices. Familiarity with design patterns and frameworks in data engineering.

Additional Comments:
POSITION DESCRIPTION: Risk Data and Analytics Expert Analyst

SUMMARY: The Risk Data and Analytics Expert Analyst is responsible for comprehensive data mining, management, and manipulation techniques to report on Medicare Advantage (MA) and Commercial Qualified Health Plan (QHP) programs, assisting in organizing and facilitating strategic program creation and execution with minimal direction from the Principal BI Analyst, Supervisor, or leadership. This role assists in researching and compiling appropriate and relevant data and feedback for risk activities.

RESPONSIBILITIES/TASKS:
• Researches, analyzes, identifies, and evaluates data from assigned problems to evaluate existing and potential trends and issues.
• Generates various reports using Claims, Membership, MMR, RAPS, MAO, and RACSD files.
• Creates data insights from the cloud using data science techniques in Python or SQL.
• Strong US healthcare knowledge in the member, provider, and claims domains.
• Knowledge of Risk Adjustment and Quality initiatives/programs preferred.
• Ability to understand the end-to-end (E2E) flow of healthcare data models.
• Possesses and maintains comprehensive knowledge of MA/ACA/Medicaid business, products, programs (including provider data, networks, etc.), corporate organizational structure (including functional responsibilities), and basic research principles/methodologies.
• Designs, develops, tests, and delivers solutions comprising components, reports, and data stories per requested deliverable directions with minimal guidance from senior team members or leadership.
• Assists in developing lines of communication to discuss/review results of analysis with management via reports/presentations and assists management in implementing programs that provide solutions.
• Assists leadership by investigating, reviewing, and recommending innovative solutions which identify problems/root causes of issues.
• Assists in identifying resolution of challenges and issues in order to fulfill key corporate objectives, responds to the demands of change management, and initiates actions needed to plan, organize, and control team activities.
• Assists with and documents feedback between corporate business areas and participates in group or committee discussions.

This position description identifies the responsibilities and tasks typically associated with the performance of the position. Other relevant essential functions may be required.

EMPLOYMENT QUALIFICATIONS:

EDUCATION: Bachelor's degree in Business Administration, Economics, Health Care, Data Analytics, Data Science, Information Systems, Statistics, or a related field. A relevant combination of education and experience may be considered in lieu of a degree. Continuous learning, as defined by the Company's learning philosophy, is required. Certification or progress toward certification is highly preferred and encouraged.

EXPERIENCE: Four (4) to six (6) years of experience in a related field required to provide the necessary knowledge, skills, and abilities for the role. Senior Analyst experience required, with proven experience in operational analysis, data analysis, and problem-resolution activities.

SKILLS/KNOWLEDGE/ABILITIES (SKA) REQUIRED:
• Strong experience using data science techniques to create data stories that provide data insights to leadership.
• Strong knowledge of data mining is required.
• Strong execution in a fast-paced environment with tight deadlines.
• Administers and adheres to corporate and departmental policies, practices, and procedures.
• Strong analytical, planning, problem-solving, verbal, and written skills to communicate complex ideas.
• Strong ability to learn new technology, techniques, and processes.
• Strong knowledge and use of existing software packages (Power BI, PowerPoint, Excel, Word, etc.).
• Strong knowledge of data languages such as SQL, T-SQL, Python, SAS, etc.
• Strong analytical, organizational, planning, and problem-solving skills.
• Affirmation from leadership based upon delivered MA, ACA, and Medicaid solutions.
• Strong understanding of and ability to apply statistical inference.
• Strong ability to read and interpret documents such as safety rules, operating and maintenance instructions, and procedure manuals. Strong ability to write reports and correspondence. Strong ability to speak effectively before groups of customers or employees of the organization.
• Strong ability to define problems, collect data, establish facts, and draw valid conclusions. Strong ability to interpret an extensive variety of technical instructions in mathematical or diagram form and deal with several abstract and concrete variables.
• Strong ability to develop project management, meeting process, and presentation skills.
• Strong ability to work independently, within a team environment, and communicate effectively with employees and clients at all levels.
• Other related skills and/or abilities may be required to perform this job.