We are looking for a highly skilled Senior Data Engineer with a strong background in TypeScript, data architecture, and large-scale data engineering. This role involves designing and building robust data pipelines, developing efficient data structures, and collaborating closely with global teams to deliver high-quality, reliable data solutions. You will play a key role in shaping our data infrastructure, improving scalability, and supporting data-driven decision-making across the organization.
Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL workflows to support analytics and business operations.
- Implement and optimize data scraping systems, ensuring accuracy, performance, and compliance.
- Architect scalable data storage solutions and ensure data consistency across multiple environments.
- Collaborate with software engineers and data scientists to integrate data systems and streamline data access.
- Troubleshoot data issues and implement solutions for performance optimization and data quality improvement.
- Develop tools and scripts (mainly in TypeScript) to automate and monitor data collection and transformation processes.
- Maintain a solid understanding of network protocols (IP, HTTP, TCP) and apply them in managing data flows and integrations.
Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related technical field.
- 5+ years of professional experience in data engineering or a closely related role.
- Strong TypeScript development skills (advanced knowledge required).
- Expertise in data structure design and problem-solving.
- Proven experience in data scraping, including handling APIs, web crawling, and large-scale data extraction.
- Solid understanding of IP, HTTP, and TCP protocols.
- Strong background in data architecture and data engineering principles.
- Experience with modern data technologies (e.g., SQL/NoSQL databases, cloud storage, data lakes, or data warehouses).
- Proficiency with ETL tools, data pipelines, and workflow orchestration frameworks (e.g., Airflow, Dagster, or Prefect) is a plus.
- Familiarity with Python or other scripting languages is an advantage.
- Excellent communication and collaboration skills, with the ability to work independently in a remote setup.
- Flexibility to work US business hours (3:00/4:00 PM to 12:00/1:00 AM Cairo time).
- Excellent English communication skills.
- This opportunity is remote for Egyptian applicants and hybrid for Lebanese applicants.
Nice to Have:
- Experience with cloud platforms (AWS, GCP, or Azure).
- Exposure to data streaming tools (Kafka, Spark, etc.).
- Familiarity with data governance, security, and compliance best practices.
Disclaimer:
We use Zoho Recruit and other AI tools to help review and organize job applications (for example, reading CVs, organizing them, and extracting information from them). These tools are only used to assist our team.
No hiring decision is made solely by automated means; all applications are subject to human review and assessment.