
  • Posted 3 days ago
  • Be among the first 10 applicants
Job Description

Are you passionate about building high-performance data pipelines that power real applications, not just reporting?

Do you thrive on transforming complex raw data into scalable, application-ready models within a cloud environment?

Ready to take ownership of backend data architecture that enables APIs, integrations, and next-generation digital solutions?

MacGregor Black is currently partnering with a global consumer business in the search for a Data Engineer. This is a permanent role based in Dubai.

We are seeking a Data Engineer to design, build, and optimize the data transformation layer that powers enterprise applications. The role focuses on turning raw data into structured, application-ready datasets and data models, enabling APIs, integrations, and advanced digital solutions. This is a backend-focused role, not a reporting/BI position.

This is a rolling yearly contract position with strong long-term potential.

Key Responsibilities:

  • Design and implement data pipelines to process raw datasets into clean, structured formats.
  • Develop and maintain curated data layers optimized for system consumption.
  • Create reusable transformation logic and standardized processing frameworks.
  • Build data structures and schemas for application performance and integration.
  • Prepare data for backend applications, APIs, and AI/ML solutions.
  • Ensure low-latency, high-performance data access for downstream systems.
  • Design normalized and domain-based data models.
  • Implement partitioning, indexing, and tuning strategies for scalability and efficiency.
  • Optimize transformations for performance and cost within the cloud environment.
  • Apply data validation, cleansing, and reconciliation checks.
  • Maintain metadata, documentation, and data lineage.
  • Ensure security and access controls comply with enterprise standards.
  • Support enhancements, new initiatives, and evolving data requirements.
  • Monitor pipelines, troubleshoot issues, and ensure reliable data delivery.

What We're Looking For:

  • At least 6 years of experience in a similar role.
  • Hands-on experience with cloud data platforms (e.g., Azure Data Lake, Synapse, Data Factory).
  • Strong SQL skills for complex transformations.
  • Experience designing data layers for application and API consumption.
  • Understanding of ELT architecture and data performance tuning.
  • Experience with Python or PySpark.
  • Knowledge of transactional/Delta Lake frameworks.
  • Familiarity with event-driven or near-real-time data processing.

For more information, please contact Scott McGowan.


Job ID: 145037053
