- Role: Machine Learning Engineer - Remote
- Location: United Arab Emirates
- Type: Flexible Hourly Contractor
Project Overview
Work on advanced AI model evaluation initiatives. Apply machine learning engineering expertise to review implementations, document technical reasoning, and support the development of high-quality training datasets for coding and reasoning systems.
What We're Looking For
- A Bachelor's, Master's, or PhD in Computer Science, Electrical/Computer Engineering, Applied Mathematics, or a related computational field.
- 3+ years of hands-on experience developing and optimising deep learning models in PyTorch (e.g., Transformers, CNNs, diffusion models).
- Strong understanding of model architecture, training dynamics, and inference optimisation.
- Familiarity with GPU performance concepts, such as memory I/O efficiency, CUDA kernels, or PyTorch profiling tools.
- The ability to read and reason about research-level model code and articulate detailed technical feedback.
- Clear written communication and an eye for detail, with the ability to describe complex ML behaviours and trade-offs precisely.
Ideal Background
Experience in one or more of the following roles:
- Machine Learning Engineer
- Data Scientist
- Software Engineer (ML focus)
- Data Engineer
Key Responsibilities
- Code & Architecture Evaluation: Review PyTorch/TensorFlow implementations of Transformers, CNNs, and attention mechanisms for efficiency and production readiness.
- Technical Reasoning Documentation: Document expert decision-making for ML architecture choices, hyperparameter tuning, and AWS/Docker deployment trade-offs.
- Prompt & Scenario Creation: Author evaluation prompts testing ML system design, MLOps workflows, and framework-specific optimisation strategies.
- Data Pipeline Review: Evaluate Pandas/SQL preprocessing, Airflow orchestration, FastAPI services, and Git/CI-CD workflows for production ML systems.
- Core Technical Skills: Python (Pandas/NumPy), PyTorch/TensorFlow/Scikit-learn, and AWS/Docker/LangChain for modern ML engineering workflows.
Sample Weekly Workflow (20-Hour Example)
- Monday: Code reviews and AWS evaluations
- Tuesday: Prompt development and LangChain scenarios
- Wednesday: Quality review and structured feedback
- Thursday: Architecture annotations and pipeline assessments
- Friday: Complex evaluations and documentation
- Weekend: Priority or overflow tasks (if required)
Apply Now!