Devoteam is an AI-driven tech consulting firm specializing in cloud platforms, cybersecurity, data solutions, and sustainability. With nearly 30 years of tech-native expertise, we guide businesses through sustainable digital transformations to deliver tangible value.
Present in over 25 countries across Europe, the Middle East, and Africa, and powered by over 11,000 employees, Devoteam is committed to leveraging technology to serve people. This commitment is strengthened by our strategic partnerships with leading platforms such as AWS, Google Cloud, Microsoft Azure, and ServiceNow.
In Luxembourg, we are a strong workforce of almost 150 professionals operating from two modern offices, home to an Innovative Tech Department: a focused group of around 30 specialists in cutting-edge technologies. Embracing a vendor-agnostic approach, this team cultivates deep expertise across diverse cloud providers, open-source technologies, and other innovative tools, enabling us to craft tailored solutions that best fit our clients' needs.
Job Description
As a Data Engineer at Devoteam, you will be instrumental in designing, building, and maintaining robust data infrastructures. You will work with cutting-edge technologies to transform raw data into actionable insights, enabling our clients to make informed decisions and drive innovation. We are looking for a passionate and skilled individual to join our Innovative Tech team and contribute to challenging and impactful projects.
Key Responsibilities
- Development and optimization of ETL/ELT pipelines (batch & streaming)
- Integration of data from structured and unstructured sources
- Design and management of relational and NoSQL databases
- Implementation of cloud-based data architectures (Data Lakehouse, Delta Lake, etc.)
- Data modeling to support analysis and decision-making
- Ensuring data quality, security, governance, and performance
- Close collaboration with Data Science, BI, Product, and Architecture teams
Qualifications
Must-have
- Technical Proficiency: Strong Python expertise (pandas, NumPy, Polars, DuckDB), workflow orchestration with Airflow and Dagster, and experience with data storage and formats such as Parquet, Delta Lake, S3, and ADLS
- Data Processing: Experience using Kafka, Hadoop, Spark Streaming, and Flink
- Database Expertise: Knowledge of relational and NoSQL databases, including PostgreSQL, MySQL, SQL Server, MongoDB, and Cassandra
- Cloud Platforms: Hands-on experience with cloud data platforms such as Databricks, Snowflake, GCP BigQuery, and AWS Redshift
- Agile Thinker & Communicator: A true team player with excellent communication skills, a knack for solving complex problems with particular attention to detail, and the drive to take ownership and initiative
Nice to have
- Experience in developing and deploying machine learning models, and an understanding of AI frameworks and libraries such as TensorFlow, PyTorch, and scikit-learn
- Certifications such as Google Cloud Professional Data Engineer or Google Cloud Professional Machine Learning Engineer
- Experience in regulated environments (e.g., financial services, healthcare, critical infrastructure)
- Knowledge of Luxembourg's regulatory environment, including CSSF regulations
Additional Information