As a Senior Data Engineer, you will support and guide our client in the design, implementation, and optimization of modern data platforms based on Microsoft Fabric.
Responsibilities
Design, build, and orchestrate data ingestion pipelines using Azure Data Factory and Dataflows Gen2
Develop and maintain scalable ETL/ELT processes
Optimize Spark workloads using Microsoft Fabric resource profiles (read-heavy, write-heavy, hybrid)
Design and manage Lakehouses and Delta tables
Build and optimize SQL Warehouse schemas
Implement and maintain medallion architecture (Bronze / Silver / Gold)
Ensure data quality, performance, and reliability across the platform
Collaborate with cross-functional teams (BI, data science, business)
Produce clear technical documentation and support architectural governance
Profile
Minimum 5 years of experience in Data Engineering or a similar senior role
Strong experience with Microsoft Fabric (Lakehouse, Warehouse, Pipelines, Notebooks)
Experience with OneLake data management
Strong problem-solving and debugging skills
Ability to work in cross-functional teams
Strong communication, documentation, and architectural thinking skills
Fluency in English is required
Technical skills:
Spark (PySpark, Scala, SQL)
Delta Lake & Lakehouse architecture
Azure Data Factory pipelines
SQL (T-SQL required, DAX is a plus)
Git and CI/CD practices
REST APIs and service principals
Why Cronos Group?
Cronos Group is a leading IT services and consulting company, offering innovative solutions and long-term partnerships with major institutions and enterprises.
We offer you:
An attractive salary package
A healthy work-life balance
The assurance of working with cutting-edge technologies in an entrepreneurial spirit
The opportunity to develop your skills through training courses tailored to your needs
A good job in a friendly and professional environment
If you would like to join a dynamic, human-scale organization and work with the latest technologies, don't wait any longer and join Cronos!