Company Description
Trade W is a leading multi-asset trading platform with over seven years of industry experience, providing global users with secure, convenient, and efficient access to the financial markets. We offer CFD trading across a wide range of asset classes including forex, cryptocurrencies, stocks, indices, metals, and commodities through our intuitive app and web platform.
Launched in 2018 as the flagship brand of Tradewill Global LLC, Trade W was built on a customer-first philosophy and a vision to make trading success more accessible. Today, we continue to grow as a trusted platform, committed to empowering traders worldwide with equal opportunities for success.
Job Summary
We are seeking a Data Development Engineer to build and maintain high-performance data infrastructure that supports risk control, quantitative research, trading analytics, and real-time monitoring.
Responsibilities
1. Data Warehouse Development
- Build domain data warehouses (orders, risk, users) using PostgreSQL / ClickHouse
- Design table structures, partitions, materialized views, and optimize queries
2. Real-time and Batch Data Pipelines
- Develop ETL/ELT pipelines using Kafka with Spark or Flink, orchestrated via Airflow or Dagster
- Handle market data, order flow, and risk-related data ingestion
3. Internal Data Services
- Develop internal data service APIs using Python or Java
- Support risk dashboards, quantitative research, and trading analysis
4. Data Visualization Support
- Collaborate with BI teams (FineBI, PowerBI, Tableau) to build analytical reports for risk, trading, and user behavior analysis
5. Data Quality and Monitoring
- Establish data validation, monitoring, and alerting mechanisms to ensure stability and consistency
6. Optional: On-chain Data
- Participate in blockchain data collection and modeling
- Integrate node RPC / block explorers to support DeFi and quantitative strategies
Requirements
Education and Experience
- Bachelor's degree or above in Computer Science, Software Engineering, Mathematics, Statistics, or related fields
- Graduates of Double First-Class / 211 / 985 / QS Top 200 universities preferred
- Minimum 3 years of experience in data development
Technical Skills
- Strong SQL skills; familiar with at least one of ClickHouse, PostgreSQL, or kdb+
- Knowledge of OLAP concepts, partitioning, sharding, materialized views
- Experience with Kafka and at least one of Spark or Flink
- Proficient in Python or Java with good engineering practices (logging, testing, Git)
- Experience with BI tools or data visualization; able to communicate with business, risk, and quant teams
Preferred Qualifications
- Experience with securities, futures, forex, or trading-related data
- Experience modeling and optimizing K-line (candlestick), order book, or tick-level data
- Experience in on-chain analytics (Dune, Flipside, The Graph, node RPC)