Ref: #55314

Data Engineer

  • Practice: Data

  • Technologies: Business Intelligence Jobs and Data Recruitment

  • Location: Brussels, Belgium

  • Type: Contract

Job description

What’s the job?

Is data technology your passion? Can you autonomously design, develop and deploy data pipelines, making sure these pipelines are filled with high-quality data that serves as input for our machine learning engineers? Then this is the job for you.

In this role, you will collaborate closely with our machine learning engineers and/or cloud architects on projects, always with a quality-focused, end-to-end approach. You will support the reporting teams in the data exploration and data preparation phases and implement data quality controls. You will liaise with IT infrastructure teams to address infrastructure issues and to ensure that the components and software used on the platform are all consistent.

 

Join Euroclear

Euroclear is a financial services company that specializes in the settlement of securities transactions, as well as the safekeeping and asset servicing of these securities. We are located in Brussels and in several major cities in Europe and around the world. We are deeply convinced that diversity of talents, backgrounds and opinions is key to success, as it encourages engagement, energy and innovation.

You will join the AIR (Analytics, Insight and Reporting) tribe in the GDC division. You will join our dedicated in-house team of data specialists, who use a pragmatic best-tool-for-the-job approach to optimise our hybrid infrastructure. With a strong focus on DataOps and MLOps, we firmly believe that robust, production-ready solutions are an essential part of our work. The result? Our team provides an ecosystem of data-driven products to internal and external consumers.

You will excel at building digital, data-driven solutions and infrastructure, and become an architectural genius who effortlessly designs, develops and deploys infrastructure and data pipelines. Our languages? Java, Scala, Spark and SQL. When not coding, we speak English, as we have colleagues from all around the world: you will join a truly international team.

 

Job requirements

 The Skills

  • You have experience with the analysis and creation of data pipelines, data architecture, ETL/ELT development, and with processing structured and unstructured data, including post-go-live activities
  • You can analyse data, identify issues (e.g. gaps, inconsistencies) and troubleshoot them
  • You have experience using data stored in RDBMSs and some experience with, or an understanding of, NoSQL databases
  • You have 3 to 5 years of experience with Scala and Spark, and a good understanding of the Hadoop ecosystem, including file formats such as Parquet and ORC
  • You can write performant Scala code and SQL statements, and can design modular, future-proof solutions that are fit for purpose (see the illustrative sketch after this list)
  • You are autonomous in working on Unix based systems.
  • You have a true agile mindset and are capable of and willing to take on tasks outside of your core competencies to help the team
  • You have experience working with customers to identify and clarify requirements
  • You have a strong interest in fintech and data-related technologies.
  • You have strong communication skills.
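
To give a feel for the kind of work these skills cover, here is a minimal, illustrative Spark job in Scala: it reads Parquet input, applies a simple data quality control and writes the curated result as ORC. The dataset, paths and quality rule are hypothetical examples, not part of this role's actual codebase.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object TradeQualityCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("trade-quality-check")
      .getOrCreate()

    // Read structured input stored as Parquet (hypothetical path).
    val trades = spark.read.parquet("hdfs:///data/raw/trades")

    // Simple data quality control: flag rows with missing identifiers
    // or non-positive amounts (hypothetical rule).
    val invalid = trades.filter(col("trade_id").isNull || col("amount") <= 0)

    // Keep only the valid rows and write them as ORC for downstream
    // reporting and machine learning consumers.
    trades.except(invalid)
      .write.mode("overwrite")
      .orc("hdfs:///data/curated/trades")

    spark.stop()
  }
}
```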

 Nice to haves

  • You have experience with open-source technologies used in data analytics, such as Spark, Hive, HBase, Kafka, …
  • You have knowledge of Cloudera and/or IBM mainframe
  • You speak English fluently, any other language is a plus.

 

You are based in Belgium or are able to come to our offices in Brussels once a week.
