
Saudi Airlines

Manager, Digital Test Center of Excellence


Job Description

Role Purpose:

The Manager, Digital Testing leads and governs end-to-end testing across Saudia's digital platforms — from test strategy and framework design through to daily test execution, defect management, and release sign-off. The role is the single accountable owner for all testing activities, including automation architecture, test bank management, test data provisioning, regression testing, performance and security testing, UAT coordination, test environments, and test tooling. The function operates as a specialized engineering discipline within the Digital Operations & Quality division, ensuring every digital release is thoroughly validated before it reaches customers.

The role's responsibilities are organized across six key leadership pillars:

  • Test Strategy, Framework Design & Automation Architecture
  • Test Execution & Defect Management
  • Performance, Security & Specialist Testing
  • Test Data, Environments & Tooling
  • UAT Coordination & Stakeholder Testing
  • Reporting, Governance & Continuous Improvement

Key Responsibilities:

Test Strategy, Framework Design & Automation Architecture

  • Define, own, and continuously evolve the Digital Testing Strategy, covering test types, automation principles, coverage targets, risk-based prioritization, and testing standards across all digital platforms and delivery teams.
  • Design and govern the test automation framework architecture across web, mobile, API, and end-to-end test layers — selecting and owning tooling (Selenium, Cypress, Playwright, Appium, REST-Assured, or equivalent) aligned to the digital technology stack.
  • Build and maintain the enterprise test bank — a versioned, structured repository of regression scripts, reusable test components, page objects, data fixtures, and API test collections — ensuring reuse, version control, and consistency across delivery streams.
  • Govern automation framework versioning, dependency management, and technical debt, ensuring the automation estate remains maintainable, extensible, and free of flaky or obsolete tests.
  • Define automation coverage targets by platform and delivery stream, tracking progress and reporting automation maturity to the GM, Digital Operations & Quality.
  • Introduce AI-assisted testing where relevant (e.g. test generation and regression triage) to improve efficiency and coverage.
  • Define acceptance criteria standards and BDD/TDD framework guidance, enabling product and engineering teams to embed testability into requirements from the outset.
  • Ensure testing standards align with security, data governance (PDPL), and accessibility (WCAG 2.1 AA).

Test Execution & Defect Management

  • Lead all test planning and execution across functional, regression, integration, API, and exploratory test cycles for all active delivery streams, ensuring test cycles are resourced and completed within release timelines.
  • Own the full defect lifecycle: initial triage and severity classification, prioritization and resolution accountability, regression verification, root-cause analysis for critical defects, and systematic prevention of recurring production escapes.
  • Enforce quality gate compliance within CI/CD pipelines, ensuring no release progresses to production without meeting defined coverage thresholds and defect resolution criteria.
  • Produce release testing sign-off reports, documenting test coverage, open defects, residual risks, and a formal go/no-go recommendation for each production deployment.
  • Coordinate test execution resource planning across multiple parallel delivery streams, balancing capacity against release priorities and escalating resourcing risks in advance.
  • Lead post-release defect retrospectives, identifying systemic testing gaps and driving targeted improvements.
  • Support quality audits by providing testing evidence and implementing corrective actions.

Performance, Security & Specialist Testing

  • Own the full performance testing program: defining and governing load profiles, stress scenarios, and response time thresholds for critical digital platforms including the booking engine, check-in journeys, loyalty portal, and API gateway.
  • Lead performance test execution ahead of major releases, peak season campaigns (Hajj, Ramadan, major promotions), and significant platform changes — producing evidence-based performance sign-off reports.
  • Own DAST / SAST execution cycles in coordination with cybersecurity — including scheduling, finding triage, remediation tracking, and closure verification.
  • Govern accessibility testing execution, ensuring digital platforms are validated against WCAG 2.1 AA standards before production release.
  • Lead and govern the execution of cross-browser, cross-device, and localization testing to ensure consistent digital experience quality.

Test Data, Environments & Tooling

  • Own test data provisioning end-to-end: define test data requirements per test type, manage data creation and masking pipelines, maintain refresh schedules, and ensure all test data is compliant with PDPL and data governance requirements.
  • Own test environment lifecycle management — coordinating with Infrastructure Operations and Release Management to ensure development, integration, staging, and UAT environments are available on schedule, stable, and representative of production configuration.
  • Define and enforce environment usage policies including booking protocols, data refresh schedules, configuration parity requirements, and environment retirement processes.
  • Own the digital testing tooling ecosystem: maintain the tooling register, lead evaluation and selection of new tools, govern license utilization, and optimize tooling usage to reduce duplication where appropriate.
  • Govern test tooling integration into CI/CD pipelines, ensuring automated testing is reliable, efficient, and effectively gates deployments.
  • Produce quarterly reporting on test environment health, test data availability, tooling adoption, and optimization opportunities.

UAT Coordination & Stakeholder Testing

  • Define and operate the User Acceptance Testing (UAT) program, ensuring structured test planning, stakeholder engagement, acceptance criteria validation, and traceable sign-off for all major releases.
  • Coordinate UAT activities across business stakeholders, product owners, and end-user representatives — maintaining clear accountability for test execution, defect resolution, and formal sign-off authority.
  • Produce UAT summary reports documenting test coverage, open defects, stakeholder sign-off status, and accepted risks for production deployment.
  • Build UAT capability within business stakeholder teams, providing templates, tooling access, training, and coordination support, ensuring consistent and effective UAT execution.

Reporting, Governance & Continuous Improvement

  • Produce weekly and monthly testing performance dashboards covering defect escape rate, automation coverage, test execution progress, performance test results, environment availability, and UAT sign-off status, providing clear visibility of quality and release readiness.
  • Lead testing retrospectives after each major release cycle, capturing process gaps, defect patterns, and automation opportunities.
  • Track and improve DORA quality metrics including change failure rate and defect detection efficiency.
  • Champion shift-left testing practices, embedding testing from requirements definition through to deployment across all delivery streams.

Qualifications:

  • Bachelor's degree in Computer Science, Software Engineering, Information Technology, or a related technical discipline.

Experience:

  • Minimum of 8 years' experience in test engineering, test automation, or quality assurance, covering strategy, execution, and tooling.
  • At least 3 years in a leadership role with direct accountability for testing teams and test execution programmes.
  • Demonstrated experience owning an end-to-end testing function including framework design, test bank management, performance testing, UAT, and test environment governance.

Preferred Certifications & Differentiators

  • ISTQB Certified Tester — Advanced Level Test Manager or Test Automation Engineer strongly preferred.
  • ITIL v4 Foundation — required.
  • Certified Agile Tester (CAT) or equivalent agile quality certification.
  • Performance testing certification (BlazeMeter, k6) — advantageous.
  • Microsoft Azure Fundamentals (AZ-900) or equivalent cloud awareness certification.
  • Experience testing airline digital platforms, PSS integrations, or travel booking engines is a strong differentiator.

Job ID: 147375575