Job Description
What You Will Do:
– Design, build, and maintain end-to-end ETL/ELT pipelines using Azure Data Factory, Azure Databricks, and related Azure data services.
– Ingest, transform, and curate data from internal systems such as CRM, ERP, finance, and operational platforms.
– Implement and optimize modern lakehouse architectures built on Databricks and Microsoft Fabric.
– Build and manage Bronze, Silver, and Gold data layers using Databricks and Delta Lake best practices.
– Enable analytics and reporting through Microsoft Fabric OneLake and Power BI using Direct Lake mode.
– Develop Power BI dashboards, Fabric semantic models, and tabular models that support business decision making.
– Collaborate closely with US-based business users, analysts, and architects to gather requirements and deliver solutions.
– Participate in regular meetings with US teams and align on priorities, timelines, and technical decisions.
– Provide status updates, identify risks, and communicate clearly in a global delivery model.
– Align working hours as needed to support overlap with US Central Time.
– Implement data quality checks, monitoring, governance, and cloud security best practices.
– Follow DevOps and DataOps practices including CI/CD pipelines and Git-based source control.
– Mentor junior engineers and contribute to technical standards, architecture decisions, and code quality.
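To give a flavor of the data-quality responsibilities above, here is a minimal sketch in plain Python of a row-level validation with a quarantine pattern. The field names and helper functions are illustrative assumptions, not part of any specific stack:

```python
# Illustrative row-level data quality check with a quarantine pattern.
# REQUIRED_FIELDS and the record shape are hypothetical examples.

REQUIRED_FIELDS = {"customer_id", "order_date", "amount"}

def validate_row(row: dict) -> list:
    """Return a list of data-quality issues found in a single record."""
    issues = []
    missing = REQUIRED_FIELDS - row.keys()
    if missing:
        issues.append("missing fields: %s" % sorted(missing))
    if "amount" in row and not isinstance(row["amount"], (int, float)):
        issues.append("amount is not numeric")
    return issues

def partition_rows(rows):
    """Split records into clean and quarantined sets for downstream layers."""
    clean, quarantined = [], []
    for row in rows:
        (quarantined if validate_row(row) else clean).append(row)
    return clean, quarantined
```

In a real pipeline the same idea would typically run as Spark expectations or Delta Lake constraints rather than pure Python; this sketch only shows the check-and-quarantine logic itself.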
What We Are Looking For:
– 5 to 8+ years of hands-on data engineering experience.
– Strong SQL skills including T-SQL, Spark SQL, or PL/SQL.
– Proficiency in Python and PySpark with experience on Apache Spark.
– Hands-on experience with Azure Data Factory, Azure Databricks, and core Azure data services.
– Strong understanding of data lake and lakehouse architectures, including Delta Lake and medallion patterns.
– Experience with data modeling including star schemas and semantic or tabular models.
– Working knowledge of CI/CD pipelines, Git, and automated deployments for data workloads.
– Experience implementing data governance, data quality, access control, and security.
– Experience working with US or other international teams in a distributed environment.
– Strong verbal and written communication skills.
– Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).
Nice to Have:
– Experience with machine learning or advanced analytics using Python or R.
– Exposure to open-source data technologies such as Kafka, Hadoop, or NoSQL databases.
– Microsoft Azure Data Engineer or Databricks certifications.
– Hands-on experience with Power BI Direct Lake, Fabric Dataflows Gen2, or Copilot for Data Factory.
Are you interested in this position?
Apply by clicking on the “Apply Now” button below!