Data Lakehouse Report Engineer

April 15, 2026
Applications close: July 14, 2026
Apply Now

Job Description

About the Role:

As a key member of our Data Foundations Team, you will be instrumental in building and maintaining data pipelines, ingesting legacy data, and developing semantic data models within a cutting-edge Microsoft Fabric Lakehouse environment. Your expertise will enable effective reporting and data governance uplift as part of a broader systems transformation.

Your Responsibilities:

– Develop and maintain scalable data ingestion pipelines

– Extract legacy data from systems such as One Housing

– Load curated datasets into Microsoft Fabric Lakehouse

– Design semantic and curated data models

– Deliver Power BI reporting solutions aligned with business needs

– Collaborate with vendor partners on technical delivery

– Translate business requirements into efficient data models

What We Are Looking For:

– Strong background in data engineering and ETL/ELT pipeline development

– Hands-on experience with Microsoft Fabric (at least 1-1.5 years preferred)

– Proficiency in Python, PySpark, SQL, and data modelling

– Experience with data pipeline orchestration and warehousing principles

– Knowledge of Power BI reporting

– Familiarity with legacy source systems and migration environments

– Transferable experience with Azure, AWS, Snowflake, or Databricks is a plus

Ideal Candidate Profile:

– Demonstrated capability in building and maintaining data pipelines

– Experience developing semantic layers and curated datasets

– Ability to work closely with stakeholders and cross-functional teams

– Exposure to housing/property systems or non-profit sectors is advantageous but not essential

Are you interested in this position?

Apply by clicking the “Apply Now” button!
