
Job Description

We have partnered with a fast-growing fintech redefining how financial data is collected, processed and used to make smarter, faster decisions. As they scale their data platform to meet increasingly complex demands, they are looking for a Senior Data Engineer to take ownership of the architecture and pipelines that sit at the heart of their business.

This is not a maintenance role; it is a greenfield opportunity to build modern, high-performance data infrastructure using the tools and practices shaping the industry right now.

If you are the kind of engineer who gets excited about real-time data, clean architecture and working in an environment where your decisions actually matter, this is the role for you.

About the role:

  • Design, build and optimise scalable data pipelines handling high-volume, high-velocity financial data in real time
  • Architect and maintain a modern data platform leveraging tools such as dbt, Apache Kafka, Spark and cloud-native services
  • Implement and evolve data lakehouse patterns, working across structured and unstructured data at scale
  • Drive adoption of DataOps practices including pipeline testing, observability, lineage tracking and automated data quality frameworks
  • Collaborate with data scientists, analysts and product teams to deliver data products that create tangible business value
  • Champion engineering best practices across version control, CI/CD, documentation and peer review
  • Contribute to platform and tooling decisions, with a genuine seat at the table on architectural direction

About you:

  • 4 to 5 years of experience in data engineering, with a track record of building and scaling production data pipelines
  • Strong hands-on experience with modern data stack tooling such as dbt, Airflow, Kafka, Spark or equivalent
  • Proficiency in Python and SQL, with an understanding of software engineering principles applied to data
  • Experience with cloud platforms such as AWS, GCP or Azure and familiarity with lakehouse architectures such as Delta Lake or Apache Iceberg
  • Exposure to real-time or streaming data processing in a production environment
  • A solid understanding of data modelling, schema design and the principles of well-structured, trustworthy data
  • Comfortable working in agile, fast-moving environments where priorities evolve quickly
  • Fintech, banking or financial services experience is advantageous

Job ID: 145116401
