Senior Data Quality Engineer · Deriv · Dubai, United Arab Emirates

Full-time · Mid-Senior level


About the job

Deriv Group is one of the largest brokers in the world, with over 1,200 team members and 20 global offices.

In a journey spanning more than 22 years, we have designed and implemented proprietary trading products and services, growing our markets to over 2.5 million customers worldwide. Our mission has remained the same throughout: to make trading accessible to anyone, anywhere. Deriv Life and Deriv Careers illustrate our focus on client satisfaction, excellence, innovation, and, most importantly, integrity. Join us to experience the excitement of a fast-paced, collaborative environment where we challenge ourselves to push the boundaries of what’s possible in the fintech industry.

Your role

As a Senior Data Quality Engineer in the Business Intelligence team, you will perform automated and manual testing of the data sets used in our internal data systems. Working with a passionate group of Data Engineers, you will write test plans and routines and manage continuous integration and regression testing processes. You will be the key point of contact for clearing release data against delivery specifications. You will also develop, test, and maintain architectures for data processing and build Extract, Transform, and Load (ETL) pipelines.

One of Deriv’s goals is to become the world’s leading online broker; to achieve that, our data warehouse must always remain trustworthy. Your skills will help us ensure data accuracy and quality and enhance our expertise in the business.

Your challenges

  • Maintain data integrity while extracting data from complex in-house and third-party sources, taking responsibility for data security, accuracy, and accessibility.
  • Establish and implement test plans, including identifying, analysing, and resolving quality issues related to code and data. Ensure ETL processes maintain data integrity.
  • Catalogue new and existing data sources and ETL processes.
  • Design resilient and extensible extraction pipelines.
  • Optimise and standardise existing data management workflows/processes, design new enhanced processes, and update documentation for each data solution.
  • Collaborate externally with clients to acquire data extraction access. Work internally with other departments to establish data collection and analysis procedures.


Requirements

  • A minimum of 7 years of experience in the data quality or data engineering field
  • Passion for data and for analysing business needs
  • Hands-on experience with one or more common data integration tools such as Airflow, DataStage, Informatica, Stitch, Talend
  • Experience troubleshooting and resolving issues in ETL processes
  • Experience with SQL, preferably PostgreSQL
  • Solid knowledge of testing and data quality, with a data-oriented mindset
  • Familiarity with data ingestion pipeline concepts and data technologies such as OLTP databases, Data Warehousing, Data Lakes, DW/BI tools
  • Experience with Data Warehouse, Data Marts, Dimensional Modeling, Data Quality concepts, and techniques
  • Proficiency in data wrangling, data masking, self-recovery, and creating data alerts
  • Experience using APIs or Webhooks to pull data from various sources
  • Good programming ability in Python
  • Hands-on experience with Google Cloud Platform (GCP) services such as BigQuery, scheduled queries, Cloud Storage, and Cloud Functions
  • Knowledge of CI/CD principles and tools (e.g., CircleCI)
  • A strong background in data provisioning and ETL processes
  • Experience peer-reviewing pipeline code and suggesting improvements where required
  • Fluency in spoken and written English

What’s Good To Have

  • Experience with iterative, lean, or agile development methodologies
  • Knowledge of Database Administration, back-end, and Reporting tools
  • Experience in testing back-end services such as APIs, databases, and distributed services
  • Experience in developing cron jobs, REST APIs, Web Services, text and XML processing
  • Experience in managing stakeholders’ expectations and technical requirement gathering
  • Familiarity with container technologies such as Docker
  • A fintech background


Benefits

  • Market-based salary
  • Annual performance bonus
  • Medical insurance
  • Housing and transportation allowance
  • Casual dress code
  • Work permit
