[E218] | Senior Data Engineer

Job details

We are currently seeking a Senior Data Engineer with 5-7 years' experience. The ideal candidate can work independently within an Agile environment and has experience with cloud infrastructure, leveraging tools such as Apache Airflow, Databricks, dbt, and Snowflake. Familiarity with real-time data processing and AI implementation is advantageous.

Responsibilities:
- Design, build, and maintain scalable and robust data pipelines to support analytics and machine learning models, ensuring high data quality and reliability for both batch and real-time use cases.
- Design, maintain, and optimize data models and data structures in tooling such as Snowflake and Databricks.
- Leverage Databricks for big data processing, ensuring efficient management of Spark jobs and seamless integration with other data services.
- Utilize PySpark and/or Ray to build and scale distributed computing tasks, enhancing the performance of machine learning model training and inference processes.
- Monitor, troubleshoot, and resolve issues within data pipelines and infrastructure, implementing best practices for data engineering and continuous improvement.
- Document data engineering workflows diagrammatically.
- Collaborate with other Data Engineers, Product Owners, Software Developers, and Machine Learning Engineers to implement new product features by understanding their needs and delivering timely solutions.
Qualifications:
- Minimum of 5 years' experience deploying enterprise-level, scalable data engineering solutions.
- Strong examples of independently developed end-to-end data pipelines, from problem formulation and raw data through implementation, optimization, and results.
- Proven track record of building and managing scalable cloud-based infrastructure on AWS (including S3, DynamoDB, EMR).
- Proven track record of implementing and managing the AI model lifecycle in a production environment.
- Experience using Apache Airflow (or equivalent), Snowflake, and Lucene-based search engines.
- Experience with Databricks (Delta format, Unity Catalog).
- Advanced SQL and Python knowledge with associated coding experience.
- Strong experience with DevOps practices for continuous integration and continuous delivery (CI/CD).
- Experience wrangling structured and unstructured file formats (Parquet, CSV, JSON).
- Understanding and implementation of best practices within ETL and ELT processes.
- Data quality best-practice implementation using Great Expectations.
- Real-time data processing experience using Apache Kafka (or equivalent) is advantageous.
- Ability to work independently with minimal supervision.
- Takes initiative and is action-focused.
- Mentors and shares knowledge with junior team members.
- Collaborative, with a strong ability to work in cross-functional teams.
- Excellent communication skills, with the ability to communicate with stakeholders across varying interest groups.
- Fluency in spoken and written English.

Edelman Data & Intelligence (DXI) is a global, multidisciplinary research, analytics, and data consultancy with a distinctly human mission. We use data and intelligence to help businesses and organizations build trusting relationships with people: making communications more authentic, engagement more exciting, and connections more meaningful.
DXI brings together and integrates the necessary people-based PR, communications, social, research, and exogenous data, as well as the technology infrastructure to create, collect, store, and manage first-party data and identity resolution. DXI comprises over 350 research specialists, business scientists, data engineers, behavioral and machine-learning experts, and data strategy consultants based in 15 markets around the world.
To learn more, visit: [Company Website]
We are dedicated to building a diverse, inclusive, and authentic workplace, so if you're excited about this role but your experience doesn't perfectly align with every qualification, we encourage you to apply anyway. You may be just the right candidate for this or other roles.



Salary: negotiable

Source: Jobleads
