For one of our customers, we are looking for an experienced Data Engineer.
Start date: As soon as possible
Regime: Full-time
Location: Brussels
Languages: French-speaking team, communication in English
Working environment:
You will work within an organization that is digitizing its entire operation. As a Data Engineer, you play a crucial role in organization-wide data transformation projects. You develop and automate data processing pipelines for data modeling, analysis and reporting from various data sources. You help set up the Delta Lake architecture and deliver data-driven solutions.
Task description:
Build data streams for data acquisition, aggregation and modeling, using both batch and streaming paradigms.
Develop high-quality code for the core data stack, including the data integration hub, data warehouse and data pipelines, on Azure services.
Collaborate with other developers as part of a Scrum team to ensure joint team productivity.
Conduct data analysis, data profiling, data cleansing, data lineage, data mapping and data transformation.
Assist in architectural design and in developing, documenting and implementing end-to-end data pipelines and data-driven solutions.
Provide technical support for data problems with recommendations and solutions.
Critically analyze information needs, help define KPIs, set up monitoring and implement alerting at the data level.
Profile description:
A Bachelor's or Master's degree.
Minimum 3 years of experience in a similar position.
Strong in detailed analysis and able to translate findings into insightful syntheses and pragmatic implementations.
Must have: Knowledge of and experience with the Microsoft Azure data platform (Delta Lake, Databricks, Azure Data Factory, Event Hubs, Debezium).
Must have: Expertise in building ETL and data pipelines on Databricks using Python and SQL on Azure.
Experience with on-premises Microsoft SQL Server (including Transact-SQL, stored procedures, Analysis Services, indexing, etc.).
Must have: Experience with Azure DevOps: CI/CD implementation and automation of Azure Data Factory deployments.
Knowledge of BI concepts and their implementation, preferably star schema modeling.
Good insight into the Microsoft Fabric suite and able to critically compare this technology with alternatives such as Databricks within a modern data architecture.
Able to work independently and collegially, and to think along with the team.
A critical mindset and the ability to hold substantive discussions at any level within the organization.
Able to handle deadlines and priorities and to work in an Agile way.