Job Purpose:
Design, build, and maintain scalable data pipelines and infrastructure to support Modupay's intelligence and insights products. Ensure data reliability, availability, and quality across the organization, with a focus on supporting payments and fintech-related data needs.
Key Accountabilities:
- Develop and maintain robust data pipelines that ingest, transform, and normalize large volumes of transaction data from multiple sources.
- Implement incremental loading, error handling, retry logic, and pipeline monitoring to ensure SLA-driven data freshness for downstream commercial products.
- Build unified data models that normalize transaction data across payment schemes and local rails into consistent reporting structures, enabling cross-market and cross-product analytics.
- Implement data quality, validation, and testing frameworks (data contracts, automated checks, anomaly detection) to ensure the integrity and consistency of data flowing into analytics products, reporting, and machine learning use cases.
- Collaborate with data analysts and internal stakeholders to understand data requirements and deliver scalable solutions.
- Implement and maintain data governance, security, and privacy best practices in compliance with Modupay's policies and regulatory requirements.
- Identify performance bottlenecks and optimize query performance, data storage, and processing efficiency.
- Stay current with emerging technologies and industry trends, recommending improvements to the data architecture and tooling.
Knowledge and Competencies:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field; a Master's degree is a plus.
- Advanced SQL proficiency. Experience with PostgreSQL and Oracle preferred.
- 3 years of experience in data engineering, preferably within payments, fintech, or financial services.
- Strong data modeling skills.
- Hands-on experience with ETL/ELT frameworks and tools.
- Proficiency in Python or another data engineering language for pipeline development, scripting, and automation.
- Experience with data quality and testing frameworks.
- Exposure to cloud data platforms (AWS, GCP, or Azure).
- Experience with BI/visualization tool integration.
- Strong analytical and problem-solving skills with attention to data accuracy and reconciliation.
- Effective communicator, able to translate technical data architecture decisions into business-understandable terms for analysts and product stakeholders.
- Fluency in English.