Data Engineer

January 24, 2025

Job Description

Overview


At Cybernetic Controls Limited (CCL), we are committed to global leadership in providing innovative digital solutions that empower businesses to reach their full potential. As a remote-first company, we believe in empowering our employees to work in a way that best suits their individual needs, fostering a culture of flexibility and trust. Since our founding in 2020, we have successfully delivered high-quality resources to our clients in the FinTech sector across various business areas. Read more on the Cybernetic Controls website.


Our Client


Kaizen is a multi-award-winning RegTech company on a mission to transform the quality of regulatory reporting in the financial services industry. We’ve combined regulatory expertise with advanced technology to develop our market-leading quality assurance services. Unique in being able to fully assess data quality, our services are used by some of the world’s largest investment banks, asset managers, hedge funds and brokers, helping them to reduce costs, improve quality and increase confidence in their regulatory reporting.


Job Summary


CCL is seeking a Data Engineer to join our fast-growing team. The successful candidate will join the data engineering team at CCL to work on ETL and development tasks. This is an exciting and challenging opportunity to build new pipelines combining and processing large amounts of structured, semi-structured and unstructured data from a variety of sources.

Key Responsibilities

  • Build pipelines using AWS cloud computing solutions that make data available with robustness, maintainability, efficiency, scalability, availability and security.
  • Develop Python and PySpark code that implements complex data transformations.
  • Maintain databases and APIs for the storage and transmission of data.
  • Monitor pipelines in production (and develop tools to facilitate this). 
  • Work collaboratively with other team members (brainstorming, troubleshooting, and code review). 
  • Liaise with other development teams to ensure the integrity of data pipelines.

Skills, Knowledge and Expertise

Skills:

  • Outstanding interpersonal skills
  • Excellent verbal and written communication skills
  • Fluent English 

Experience: 

  • Developing and monitoring complex production data pipelines that interact with a range of data sources (file systems, web, database, users)
  • Strong experience with Amazon Web Services (AWS)
  • At least 2 years’ experience in a similar role, and at least 3 years’ overall experience in IT

Knowledge: 

  • Data modelling, data pipeline architecture, Big Data implementation
  • Scrum/Agile best practices
  • Financial knowledge would be an asset

Qualifications/Training

  • Bachelor’s degree or equivalent in Computer Science or a related subject.

What you’ll get in return

  • Competitive salary package 
  • Private healthcare contribution 
  • Annual pay review 
  • Regular team socials 
  • Working within a culture of innovation and collaboration 
  • Opportunity to play a key role in a pioneering growth company
  • Company laptop provided