Job Description
Overview
At Cybernetic Controls Limited (CCL), we are committed to global leadership in providing innovative digital solutions that empower businesses to reach their full potential. As a remote-first company, we believe in empowering our employees to work in a way that best suits their individual needs, fostering a culture of flexibility and trust. Since our founding in 2020, we have successfully delivered high-quality resources to our clients in the FinTech sector across various business areas. Read more on the Cybernetic Controls website.
Our Client:
Kaizen is a multi-award-winning RegTech company on a mission to transform the quality of regulatory reporting in the financial services industry. We’ve combined regulatory expertise with advanced technology to develop our market-leading quality assurance services. Unique in being able to fully assess data quality, our services are used by some of the world’s largest investment banks, asset managers, hedge funds and brokers, helping them to reduce costs, improve quality and increase confidence in their regulatory reporting.
Job summary
CCL is seeking a Data Engineer to join our fast-growing data engineering team, working on ETL and development tasks. This is an exciting and challenging opportunity to build new pipelines that combine and process large volumes of structured, semi-structured and unstructured data from a variety of sources.
Key Responsibilities
- Build pipelines using AWS cloud computing solutions that make data available with robustness, maintainability, efficiency, scalability, availability and security.
- Develop Python and PySpark code that implements complex data transformations.
- Maintain databases and APIs for the storage and transmission of data.
- Monitor pipelines in production (and develop tools to facilitate this).
- Work collaboratively with other team members (brainstorming, troubleshooting, and code review).
- Liaise with other development teams to ensure the integrity of data pipelines.
Skills, Knowledge and Expertise
Skills:
- Outstanding interpersonal skills
- Excellent verbal and written communication skills
- Fluent English
Experience:
- Developing and monitoring complex production data pipelines that interact with a range of data sources (file systems, web, database, users)
- Strong experience with Amazon Web Services (AWS)
- At least 2 years’ experience in a similar role, and at least 3 years of overall experience in IT
Knowledge:
- Data modelling, data pipeline architecture, Big Data implementation
- Scrum/Agile best practices
- Financial knowledge would be an asset
Qualifications/Training
- Bachelor’s degree or equivalent in Computer Science or a related subject.
What you’ll get in return:
- Competitive salary package
- Private healthcare contribution
- Annual pay review
- Regular team socials
- Working within a culture of innovation and collaboration
- Opportunity to play a key role in a pioneering growth company
- Company laptop provided