As a Data Platform Technical Lead, you ensure the smooth operation and continuous optimization of data environments used by both expert and novice data scientists.
As a key member of the team, you will play a pivotal part in defining and managing the Dataiku platform, including migration, integration with key applications, and automation of pipelines. You will also manage and supervise a dedicated support team, act as the technical lead on Dataiku-related matters, and drive the execution of critical activities within the platform.
Your Responsibilities
* Defining and Applying Best Practices: Define and consistently apply best practices within the Dataiku platform to ensure its optimal performance and efficiency.
* Detecting Automation Opportunities: Identify opportunities for automation and implement them to streamline processes and improve productivity.
* Communicating Complex Ideas: Explain Dataiku capabilities to non-technical stakeholders while engaging in technical conversations with engineers and data scientists.
* Staying Up-to-Date: Stay current with modern data platform and automation trends to maintain a competitive edge.
* Collaboration and Teamwork: Handle smaller deliverables independently and contribute to larger deliverables as part of a team.
* Raising Impediments and Seeking Support: Identify and address impediments, seeking support when necessary to ensure project success.
* Suggesting Improvements: Actively suggest improvement opportunities, both technical and process-related, to enhance the overall quality and efficiency of the platform.
* Documentation and Knowledge Sharing: Create concise and meaningful documentation of policies, best practices, and architectures to facilitate knowledge sharing and collaboration.
Your Profile
We are looking for an individual with:
* At least 3 years of experience as a data platform technical expert or in an MLOps role.
* Expertise in supervised and unsupervised machine learning (classification/clustering, regression, pattern recognition), with a strong focus on workflow optimization using Dataiku.
* Familiarity with automation techniques, integration with enterprise systems, monitoring techniques, and key metrics for platform performance.
* Excellent programming skills in Python and familiarity with common DevOps tools such as Git and Terraform.
* Highly desirable: a good understanding of AWS infrastructure and deployment practices, fluency in English, and a Master's degree or PhD in applied Physics, Mathematics, Computer Science, or equivalent.
Success Factors
* Strong Communication Skills: Ability to convey complex ideas clearly and persuasively.
* Teamwork and Collaboration: Excellence in collaborating and building meaningful relationships.
* Organizational Talent: Maintain oversight and prioritize effectively in a dynamic environment.
* Problem-Solving Ability: Think creatively and tackle challenges proactively.
What We Offer
We offer a full-time position with a competitive salary package and attractive secondary benefits, including a company car, laptop, meal vouchers, a bonus system, and more. Our training and development program provides access to courses and certifications to enhance your skills. We also offer flexibility through our hybrid work model, which supports work-life balance. Regular company outings and events celebrate our successes together.
Our recruitment process includes several stages: a first screening via phone, an initial interview with a Talent Acquisition Professional, a discussion with one of our People Managers to ensure the right fit, and a meeting with our Technical Colleagues to assess your skills and experience.

Capgemini Engineering is a global business and technology transformation partner helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society.