Bupa Arabia

Manager - Data Engineering

Job Description

Role Purpose:

The role involves designing and implementing robust data ingestion pipelines, enabling SAS Viya Fraud, Waste, and Abuse (FWA) data models, and optimizing large-scale data workloads in Google BigQuery. The ideal candidate will have strong experience in ETL/ELT development using Informatica IDMC, cloud data platforms, and healthcare data models.

Key Accountabilities:

1- Enterprise Data Ingestion and Data Engineering:

  • Build reusable, parameterized mappings/taskflows.
  • Implement CDC, idempotent loads, schema evolution, DQ gates.
  • Optimize BigQuery loads (partition/cluster, load vs. stream); see the load sketch after this list.
  • Set up version control and CI/CD.
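
As a rough illustration of this accountability, the sketch below creates a day-partitioned, clustered BigQuery table and runs an idempotent MERGE from a staging table; the project, dataset, and column names are assumptions, not the actual schema.

```python
# Minimal sketch (assumed project/dataset/table names) of an idempotent,
# partitioned BigQuery load using the google-cloud-bigquery client.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumed project id

# Day-partitioned, clustered target table for claim records.
table = bigquery.Table(
    "my-project.curated.claims",
    schema=[
        bigquery.SchemaField("claim_id", "STRING"),
        bigquery.SchemaField("member_id", "STRING"),
        bigquery.SchemaField("provider_id", "STRING"),
        bigquery.SchemaField("claim_amount", "NUMERIC"),
        bigquery.SchemaField("service_date", "DATE"),
        bigquery.SchemaField("updated_at", "TIMESTAMP"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(field="service_date")
table.clustering_fields = ["provider_id", "member_id"]
client.create_table(table, exists_ok=True)

# Idempotent load: MERGE from a staging table keyed on claim_id, so replaying
# the same batch (e.g. after a retry) does not create duplicates.
merge_sql = """
MERGE `my-project.curated.claims` AS t
USING `my-project.staging.claims_batch` AS s
ON t.claim_id = s.claim_id
WHEN MATCHED AND s.updated_at > t.updated_at THEN
  UPDATE SET member_id = s.member_id, provider_id = s.provider_id,
             claim_amount = s.claim_amount, service_date = s.service_date,
             updated_at = s.updated_at
WHEN NOT MATCHED THEN
  INSERT ROW
"""
client.query(merge_sql).result()
```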

2- Data Mapping & Transformation Design:

  • Profile sources; define field-level mappings and rules.
  • Specify business logic, joins, lookups, derivations.
  • Define DQ rules, exceptions, and rejects; see the mapping sketch after this list.
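
A minimal sketch of a field-level mapping specification with a DQ gate that routes rejects; the source field names, transformations, and rules are illustrative only.

```python
# Minimal sketch of a field-level mapping spec with DQ rules; field names and
# thresholds are illustrative, not taken from the actual FWA sources.
from datetime import date

# Source-to-target mapping: target field -> (source field, transformation).
MAPPING = {
    "claim_id":     ("CLM_NO",  str.strip),
    "member_id":    ("MBR_ID",  str.strip),
    "claim_amount": ("CLM_AMT", lambda v: round(float(v), 2)),
    "service_date": ("SRV_DT",  date.fromisoformat),
}

# DQ rules: target field -> predicate that must hold, else the record is rejected.
DQ_RULES = {
    "claim_id":     lambda v: bool(v),
    "claim_amount": lambda v: v >= 0,
    "service_date": lambda v: v <= date.today(),
}

def transform(record: dict) -> tuple[dict | None, str | None]:
    """Apply the mapping, then the DQ gate. Returns (row, None) or (None, reason)."""
    row = {}
    for target, (source, fn) in MAPPING.items():
        try:
            row[target] = fn(record[source])
        except (KeyError, ValueError) as exc:
            return None, f"{target}: {exc}"
    for field, rule in DQ_RULES.items():
        if not rule(row[field]):
            return None, f"DQ violation on {field}: {row[field]!r}"
    return row, None

good, rejects = [], []
for rec in [{"CLM_NO": " C-1 ", "MBR_ID": "M9", "CLM_AMT": "120.5", "SRV_DT": "2024-05-01"},
            {"CLM_NO": "",      "MBR_ID": "M9", "CLM_AMT": "-3",    "SRV_DT": "2024-05-01"}]:
    row, reason = transform(rec)
    (good if row else rejects).append(row or (rec, reason))
```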

3- Orchestration, Automation & Reliability Engineering:

  • Parameterize taskflows; set schedules/dependencies.
  • Implement retries/backoff and checkpointing; see the orchestration sketch after this list.
  • Integrate monitoring/alerts (Ops console, ChatOps).
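
A minimal orchestration sketch, assuming recent Apache Airflow (2.4+, listed under Skills): a parameterized daily schedule, retries with exponential backoff, and a failure callback that could forward alerts to a ChatOps channel. The DAG id, task names, and callables are placeholders.

```python
# Minimal Airflow sketch with retries, exponential backoff, and an alert hook.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def notify_chatops(context):
    # Placeholder: post the failed task id to a ChatOps webhook.
    print(f"ALERT: {context['task_instance'].task_id} failed")

def load_claims_batch(**_):
    # Placeholder for the IDMC taskflow trigger / BigQuery load step.
    pass

with DAG(
    dag_id="fwa_claims_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",          # daily at 02:00
    catchup=False,
    default_args={
        "retries": 3,
        "retry_delay": timedelta(minutes=5),
        "retry_exponential_backoff": True,
        "max_retry_delay": timedelta(minutes=40),
        "on_failure_callback": notify_chatops,
    },
) as dag:
    PythonOperator(task_id="load_claims_batch", python_callable=load_claims_batch)
```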

4- Data Roles and Privacy Management:

  • Define least-privilege IAM and service accounts.
  • Apply dataset/table/row/column security & masking; see the access-control sketch after this list.
  • Enable audit logs and retention; classify PHI/PII.
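
A minimal sketch of row-level security and least-privilege dataset access using the BigQuery client; the group, service-account, dataset, and column names are assumptions, and column masking (policy tags) is omitted for brevity.

```python
# Minimal sketch of row-level security plus least-privilege dataset access.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumed project id

# Row access policy: limit FWA analysts to one region's claims.
# Assumes a `region` column on the curated claims table; a real policy
# would typically derive the predicate from a user-to-region mapping.
client.query("""
CREATE OR REPLACE ROW ACCESS POLICY region_filter
ON `my-project.curated.claims`
GRANT TO ("group:fwa-analysts@example.com")
FILTER USING (region = "CENTRAL")
""").result()

# Least-privilege dataset access: analysts read-only, the ingestion
# service account write access, nothing broader.
dataset = client.get_dataset("my-project.curated")
entries = list(dataset.access_entries)
entries.append(bigquery.AccessEntry("READER", "groupByEmail", "fwa-analysts@example.com"))
entries.append(bigquery.AccessEntry("WRITER", "userByEmail",
                                    "ingest-sa@my-project.iam.gserviceaccount.com"))
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```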

5- SAS Viya FWA Domain Modeling & Curation:

  • Gather model inputs, features, windows, granularity.
  • Align scoring logic and feature engineering steps.
  • Design curated FWA datasets (claims, eligibility, providers, members, preauths); see the curation sketch after this list.
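
An illustrative curation query joining claims to members and providers and adding rolling 90-day provider features of the kind an FWA model in SAS Viya might consume; all table and column names are hypothetical.

```python
# Illustrative curated-dataset build; names are assumptions, not a real schema.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumed project id

client.query("""
CREATE OR REPLACE TABLE `my-project.curated.fwa_claims_features`
PARTITION BY service_date
CLUSTER BY provider_id AS
SELECT
  c.claim_id,
  c.member_id,
  c.provider_id,
  c.claim_amount,
  c.service_date,
  p.specialty,
  m.plan_type,
  -- Rolling provider behaviour over a 90-day window, a typical FWA feature.
  COUNT(*) OVER w90            AS provider_claims_90d,
  SUM(c.claim_amount) OVER w90 AS provider_amount_90d
FROM `my-project.curated.claims`    AS c
JOIN `my-project.curated.providers` AS p USING (provider_id)
JOIN `my-project.curated.members`   AS m USING (member_id)
WINDOW w90 AS (
  PARTITION BY c.provider_id
  ORDER BY UNIX_DATE(c.service_date)
  RANGE BETWEEN 89 PRECEDING AND CURRENT ROW
)
""").result()
```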

6- Workload Monitoring, Performance & Cost Optimization:

  • Track freshness, job health, volumes, anomalies.
  • Monitor SLAs (duration, errors, cost/TB, slots); see the monitoring sketch after this list.
  • Tune BigQuery.
  • Optimize cost (storage lifecycle, query tuning, caching).
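
A minimal monitoring sketch that pulls duration, bytes billed, slot usage, and errors for the last day's query jobs from BigQuery's INFORMATION_SCHEMA.JOBS_BY_PROJECT view; the region qualifier and the SLA/cost thresholds are assumptions.

```python
# Minimal sketch of a daily job-health and cost check against BigQuery's
# INFORMATION_SCHEMA jobs view (region and thresholds are assumptions).
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumed project id

rows = client.query("""
SELECT
  user_email,
  job_id,
  TIMESTAMP_DIFF(end_time, start_time, SECOND) AS duration_s,
  total_bytes_billed / POW(1024, 4)            AS tib_billed,
  total_slot_ms / 1000                         AS slot_seconds,
  error_result.reason                          AS error_reason
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
  AND job_type = 'QUERY'
ORDER BY total_bytes_billed DESC
LIMIT 20
""").result()

for r in rows:
    duration = r.duration_s or 0
    tib = r.tib_billed or 0
    # Flag jobs that blow past an assumed SLA (15 min) or cost (1 TiB) threshold.
    if duration > 900 or tib > 1 or r.error_reason:
        print(f"SLA/cost check: {r.job_id} {duration}s {tib:.2f} TiB {r.error_reason or ''}")
```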

Skills

  • Data Engineering Tools: Informatica IDMC, Airflow, Apache Spark.
  • Programming Languages: SQL, Python.
  • Analytics Platforms: SAS Viya, SAS Enterprise Guide.
  • Cloud Technologies: Google Cloud Platform (BigQuery, IAM), Azure/AWS (optional).
  • Visualization Tools: Power BI, Tableau.
  • Strong in BigQuery (SQL, modeling, tuning).
  • Proficient with Informatica IDMC/IICS ETL/ELT.
  • Experience with enterprise pipelines.
  • SAS Viya (FWA) exposure is a plus.
  • Knowledge of healthcare data (claims, providers, members).
  • Understanding of cloud security (IAM, service accounts, private endpoints).

Education

Bachelor's degree in Computer Science, Information Technology, Information Systems, or a related field.

Job ID: 143046633