Sorry, this offer is no longer available,
but you can run a new search or explore similar offers:

Frontend Developer (X|F|M) - Hybrid

We are looking for a Frontend Developer (x|f|m) for our Digital Customer Experience department. In your position, you will contribute to the agile software d...


From Sartorius - Madrid

Posted a month ago

Tile Repair - Contractor For Tile Repair (Temporary)

Tile location: Kitchen; What needs repair: Wall; Tile problem to repair: Collapse, staining or missing grout; Square...


From Fixando - Madrid

Posted a month ago

Global Salesforce Analyst C4B Madrid

Do you want to change the world? That's what we do at Cabify. Our goal is to make cities better places to live by improving the mobility of...


From Cabify - Madrid

Posted a month ago

Cloud Solutions Architects - Qbx117

NEORIS is a Digital Accelerator that helps companies step into the future, with 20 years of experience as Digital Partners of some of the...


From Neoris - Madrid

Posted a month ago

Data Engineer - Platform

Offer details

Ebury is a hyper-growth FinTech firm, named in 2021 as one of the top 15 European Fintechs to work for by AltFi. We offer a range of products including FX risk management, trade finance, currency accounts, international payments and API integration.
Data Engineer - Data Platform Engineer
Madrid - 4 days in the office & 1 day working from home
About our company:
Ebury is a FinTech success story, positioned among the fastest-growing international companies in its sector. Headquartered in London, we have more than 1,500 staff covering over 50 nationalities (and counting!) working across more than 27 offices worldwide and serving more than 45,000 clients every day.
About our team:
Ebury's strategic growth plan would not be possible without our Data team and we are seeking a Data Engineer to join our Data Engineering team!
Our data mission is to develop and maintain Ebury's Data Platform and serve it to the whole company, where Data Scientists, Data Engineers, Analytics Engineers and Data Analysts work collaboratively to:

Build ETLs and data pipelines to serve data in our platform
Provide clean, transformed data ready for analysis and used by our BI tool
Develop department and project specific data models and serve these to teams across the company to drive decision making
Automate end solutions so we can all spend time on high-value analysis rather than running data extracts
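As an illustration only (not Ebury's actual codebase — the table and field names are invented, and an in-memory SQLite table stands in for the real warehouse), the extract-transform-load flow described above can be sketched in plain Python:

```python
import sqlite3

# --- Extract: a real pipeline would pull from a transactional database
# or a REST API; here a hard-coded sample payload stands in for that.
def extract():
    return [
        {"client_id": 1, "currency": "EUR", "amount": "1500.00"},
        {"client_id": 2, "currency": "GBP", "amount": "980.50"},
    ]

# --- Transform: clean and type the raw records so they are ready
# for analysis in a BI tool.
def transform(rows):
    return [(r["client_id"], r["currency"], float(r["amount"])) for r in rows]

# --- Load: write the transformed rows into the warehouse table.
def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS payments (client_id, currency, amount)")
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    load(transform(extract()), conn)

conn = sqlite3.connect(":memory:")
run_pipeline(conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone())
# → (2, 2480.5)
```

The same three-stage shape scales up when each stage becomes a task in an orchestrator rather than a plain function call.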

About our technology and Data stack:

Google Cloud Platform as our main Cloud provider
Apache Airflow and dbt Cloud as orchestration tools
Docker to deliver software in containers
Cloud Build as CI/CD
dbt for data modelling and warehousing
Looker and Looker Studio as Business Intelligence/dashboarding
GitHub as code management tool
Jira as project management tool

Among other third-party tools such as Hevodata, MonteCarlo, Synq…
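Orchestrators like Airflow and dbt share one core idea: run each task only after all of its upstream dependencies have finished. That dependency-ordering logic can be sketched minimally in plain Python with the standard library's `graphlib` (the model names below are invented for illustration, in dbt's staging/fact naming style):

```python
from graphlib import TopologicalSorter

# Hypothetical dbt-style model graph: each model lists the
# upstream models it selects from.
deps = {
    "stg_payments": set(),
    "stg_clients": set(),
    "fct_transactions": {"stg_payments", "stg_clients"},
    "dashboard_revenue": {"fct_transactions"},
}

# static_order() yields the models so that every model appears
# only after all of its upstreams — the order an orchestrator
# would execute them in.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Real orchestrators add scheduling, retries and parallelism on top, but the dependency resolution at their core is exactly this topological sort.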
About the role:
As a Data Engineer, you will work closely with the rest of the team to help model and maintain the Data Platform. Therefore, we are looking for:

3+ years of data/analytics engineering experience building, maintaining, and optimising data pipelines and ETL processes in big data environments
Proficiency in Python and SQL
Knowledge of software engineering practices in data (SDLC, RFC…)
A habit of staying informed about the latest developments and industry standards in Data
Fluency in English
As a plus:

Experience with our modern Data stack tools
Dimensional modelling/data warehousing concepts knowledge
Spanish language


Why this offer is for you:
You will:

Be mentored by one of our outstanding team members through a 30/60/90-day plan designed just for you
Participate in data modelling reviews and discussions to validate the model's accuracy, completeness, and alignment with business objectives.
Design, develop, deploy and maintain ELT/ETL data pipelines from a variety of data sources (transactional databases, REST APIs, file-based endpoints).
Deliver data models hands-on, using solid software engineering practices (e.g. version control, testing, CI/CD)
Manage overall pipeline orchestration using Airflow (hosted in Cloud Composer), as well as execution using GCP hosted services such as Container Registry, Artifact Registry, Cloud Run, Cloud Functions, and GKE.
Work on reducing technical debt by addressing code that is outdated, inefficient, or no longer aligned with best practices or business needs.
Collaborate with team members to reinforce best practices across the platform, encouraging a shared commitment to quality.
Help to implement data governance policies, including data quality standards, data access control, and data classification.
Identify opportunities to optimise and refine existing processes.
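Data-quality standards of the kind a governance policy requires are often enforced as simple assertions over each batch before it is loaded. A minimal sketch (the column names and allowed values are invented for illustration):

```python
# Hypothetical quality rules: non-null keys, allowed currency codes,
# non-negative amounts.
ALLOWED_CURRENCIES = {"EUR", "GBP", "USD"}

def quality_issues(rows):
    """Return (row_index, problem) tuples; an empty list means the batch passes."""
    issues = []
    for i, r in enumerate(rows):
        if r.get("client_id") is None:
            issues.append((i, "missing client_id"))
        if r.get("currency") not in ALLOWED_CURRENCIES:
            issues.append((i, "unknown currency"))
        if r.get("amount", 0) < 0:
            issues.append((i, "negative amount"))
    return issues

batch = [
    {"client_id": 1, "currency": "EUR", "amount": 100.0},
    {"client_id": None, "currency": "XXX", "amount": -5.0},
]
print(quality_issues(batch))
# → [(1, 'missing client_id'), (1, 'unknown currency'), (1, 'negative amount')]
```

Tools such as MonteCarlo or dbt tests automate checks of this shape at scale; the underlying idea is the same per-batch assertion.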




Nominal salary: To be agreed

Source: Whatjobs_Ppc

Requirements
