Job Description

  • Design, implement, and continuously expand data pipelines by performing extraction, transformation, and loading activities 
  • Gather requirements and business process knowledge in order to transform data to meet the needs of end users 
  • Maintain and improve existing processes 
  • Ensure that the data architecture is scalable and maintainable 
  • Work with the business in designing and delivering correct, high quality data 
  • Investigate data to identify potential issues within ETL pipelines, notify end users, and propose appropriate solutions 
  • Prepare documentation for future reference

Requirements
  • SQL knowledge (query performance tuning, index maintenance, etc.) and an understanding of database structure 
  • Knowledge of at least one ETL tool, preferably Pentaho 
  • Knowledge of data modelling principles 
  • Experience integrating MS SQL Server and PostgreSQL databases using Pentaho ETL / SSIS tools 
  • Knowledge of various SQL data storage mechanisms and Big Data technologies 
  • High attention to detail 
  • Organizational skills, including time management and planning 
  • Passionate about complex data structures and problem solving 
  • Ability to pick up new data tools and concepts quickly

Key Skills Required

  • SQL
  • ETL
  • MS SQL Server


  • Software Developer