Applied Research Quality Management System (QMS) and Stage-Gates
Design, implement, and maintain an applied research Quality Management System aligned to HCT's applied research context.
Define lifecycle stage-gates (pre-award, award acceptance, kick-off, execution reviews, reporting, close-out), including minimum quality criteria for each gate.
Establish standard operating procedures (SOPs), checklists, and quality standards for project artifacts (plans, risk registers, progress updates, deliverables acceptance, close-out packs).
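The stage-gate idea above can be illustrated as a simple checklist structure: each gate lists its minimum quality criteria, and a project may only pass a gate when every criterion is satisfied. This is a minimal sketch; the gate names come from the lifecycle above, while the criteria shown are illustrative assumptions, not HCT policy.

```python
from dataclasses import dataclass

@dataclass
class StageGate:
    """One lifecycle gate with its minimum quality criteria."""
    name: str
    criteria: list  # criterion descriptions that must all be satisfied

    def passes(self, satisfied: set) -> bool:
        # A gate passes only if every minimum criterion is satisfied.
        return all(c in satisfied for c in self.criteria)

# Illustrative gates and criteria (assumptions for the sketch).
GATES = [
    StageGate("pre-award", ["proposal complete", "budget validated"]),
    StageGate("kick-off", ["project plan approved", "risk register opened"]),
    StageGate("close-out", ["deliverables accepted", "close-out pack filed"]),
]

def first_blocked_gate(satisfied: set):
    """Return the name of the first gate whose criteria are not all met."""
    for gate in GATES:
        if not gate.passes(satisfied):
            return gate.name
    return None  # all gates cleared
```

A project that has only cleared its pre-award criteria would be blocked at "kick-off", which is the kind of portfolio-level signal the SOPs and checklists are meant to standardize.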
Project Delivery Assurance (PMO Functions Embedded into QA)
Function as the day-to-day delivery assurance lead across the applied research portfolio, ensuring consistent project planning and controls.
Coach PIs and project teams on work breakdown structures, schedules, milestones, KPIs, risk registers, change control, and reporting discipline.
Research Operations Quality (Pre-Award and Post-Award Service Quality)
Standardize and improve pre-award support quality: proposal completeness checks, budget accuracy validation, and sponsor requirement interpretation, in partnership with the operations and compliance teams.
Standardize and improve post-award service quality: grant set-up readiness, milestone tracking consistency, reporting quality, and close-out completeness, ensuring smooth execution for faculty.
Digital Quality and Data Integrity in RMS and Project Tooling
Define and enforce data quality standards for core applied research records (projects, milestones, deliverables, evidence attachments, reporting fields).
Partner with RMS owners and administrators to implement workflow controls that support quality stage-gates, required artifacts, and portfolio reporting.
Maintain a practical template and toolset library (project templates, dashboards, trackers, risk logs), ensuring consistent usage across campuses.
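Enforcing data quality standards on core records usually reduces to machine-checkable rules, e.g. required fields that must be populated before a record counts as report-ready. The sketch below assumes hypothetical field names for project and milestone records; the actual RMS schema would differ.

```python
# Required fields per record type; field names are illustrative assumptions.
REQUIRED_FIELDS = {
    "project": ["id", "title", "pi", "start_date", "end_date"],
    "milestone": ["id", "project_id", "due_date", "status"],
}

def validate_record(record_type: str, record: dict) -> list:
    """Return a list of data-quality issues found in one RMS record."""
    issues = []
    for field_name in REQUIRED_FIELDS.get(record_type, []):
        if record.get(field_name) in (None, ""):
            issues.append(f"missing {field_name}")
    return issues
```

Running such checks in bulk across the portfolio is one way to feed the workflow controls and dashboards mentioned above with a concrete completeness measure.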
KPI Frameworks, Dashboards, and Portfolio Performance Reporting
Own applied research delivery and quality KPIs (on-time milestones, deliverable acceptance rates, reporting punctuality, rework rates, cycle times, service SLAs, portfolio health).
Produce portfolio dashboards and consolidated leadership reporting, including sponsor-ready progress summaries when required.
Provide structured insights that support institutional reporting, accreditation narratives, and decision-making for resourcing and prioritization.
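As one example of the KPIs listed above, an on-time milestone rate can be computed as the share of completed milestones delivered on or before their due date. The sketch below assumes a minimal milestone representation with `due` and `completed` date fields; the real KPI definitions would be agreed as part of the framework.

```python
from datetime import date

def on_time_milestone_rate(milestones) -> float:
    """Share of completed milestones delivered on or before their due date.

    Each milestone is a dict with 'due' and 'completed' dates
    (field names are illustrative assumptions); incomplete
    milestones are excluded from the denominator.
    """
    done = [m for m in milestones if m.get("completed")]
    if not done:
        return 0.0
    on_time = sum(1 for m in done if m["completed"] <= m["due"])
    return on_time / len(done)

# Example portfolio slice: one on-time, one late, one still open.
milestones = [
    {"due": date(2025, 3, 1), "completed": date(2025, 2, 27)},
    {"due": date(2025, 4, 1), "completed": date(2025, 4, 10)},
    {"due": date(2025, 5, 1), "completed": None},
]
```

For this slice the rate is 0.5 (one of two completed milestones was on time), the kind of figure that would roll up into portfolio dashboards and leadership reporting.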
Continuous Improvement, Training, and Capability Building
Run continuous improvement cycles (lessons learned, root-cause analysis of delays and rework, template updates, process refinements).
Develop and deliver short trainings and onboarding for faculty, staff, and student teams on applied research delivery standards and tools.
Build a quality culture that increases predictability, reduces avoidable rework, and improves faculty experience with research support services.