Responsibilities
- Design and implement enterprise-wide data architecture frameworks that support cross-division data integration and usability.
- Develop scalable data pipelines and workflows (ETL/ELT) to support analytics, reporting, and machine learning workloads.
- Collaborate with stakeholders across business units to identify data requirements and consolidate sources.
- Lead efforts in data modeling, documentation, and establishing common data definitions and practices.
- Define and implement data governance, data quality, and security frameworks aligned with compliance requirements.
- Evaluate and recommend modern data technologies to support the ongoing evolution of data infrastructure.
- Partner with data engineers, analysts, and scientists to ensure data availability and usability.
- Contribute to team growth by mentoring junior team members and promoting best practices in data engineering and architecture.

Must-Have Qualifications
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Data Architect, Senior Data Engineer, or in a similar data-focused role.
- Strong knowledge of database management systems, both SQL (e.g., Azure SQL, PostgreSQL) and NoSQL (e.g., Cosmos DB, MongoDB).
- Proficiency in data modeling, data warehousing, and ETL/ELT pipelines using tools such as Apache Airflow or Azure Data Factory.
- Experience with big data technologies such as Apache Spark, Databricks, or similar.
- Familiarity with cloud data platforms, preferably Microsoft Azure.
- Strong understanding of data governance, data quality, and data security practices.
- Familiarity with containerization and orchestration tools like Docker and Kubernetes.
- Understanding of CI/CD pipelines and DevOps practices for data workflows.
- Knowledge of data privacy regulations such as GDPR, CCPA, or similar compliance frameworks.

Nice-to-Have Qualifications
- Experience with AI and machine learning frameworks (e.g., TensorFlow, PyTorch, Azure ML).
- Knowledge of data visualization tools (e.g., Power BI, Tableau, Looker) and how to support reporting layers.
- Exposure to real-time data processing and streaming technologies (e.g., Kafka, Azure Event Hubs, Stream Analytics).
- Hands-on experience with Python, PySpark, or similar scripting languages for data transformation.

LexisNexis, a division of RELX, is an equal opportunity employer: qualified applicants are considered for and treated during employment without regard to race, color, creed, religion, sex, national origin, citizenship status, disability status, protected veteran status, age, marital status, sexual orientation, gender identity, genetic information, or any other characteristic protected by law. We are committed to providing a fair and accessible hiring process. If you have a disability or other need that requires accommodation or adjustment, please let us know by completing our Applicant Request Support Form: https://forms.office.com/r/eVgFxjLmAK, or contact 1-855-833-5120.
Please read our Candidate Privacy Policy.