Azure DataOps Engineer - Data Distribution & CI/CD

Offer Details

We are a dynamic and forward-thinking team focused on delivering scalable and innovative Big Data solutions to empower organizations with actionable insights. Leveraging advanced cloud technologies, we provide a robust platform for data management and analytics that drives business growth and informed decision-making. Our team specializes in developing and maintaining a state-of-the-art Big Data platform built on Azure PaaS components. We work closely with stakeholders to understand their needs and ensure the platform's scalability, reliability, and security.
About the Role

We are looking for a detail-oriented and highly motivated professional to join our team as a DataOps Engineer. In this role, you will design and optimize data distribution strategies using cutting-edge tools such as Azure Databricks and Unity Catalog. You'll also take ownership of CI/CD pipelines and contribute to maintaining a seamless and efficient data infrastructure.
Key Responsibilities

- Design, implement, and maintain data distribution solutions with Azure Databricks, Unity Catalog, and Azure Data Factory.
- Monitor and optimize data pipelines for performance, accuracy, and scalability.
- Collaborate with data science teams to address their requirements and develop tailored solutions.
- Develop, refine, and maintain CI/CD processes for the data distribution pipeline.
- Troubleshoot and resolve data-related issues, ensuring data quality and reliability.
- Create and update technical documentation related to data distribution workflows.
- Evaluate and test new features in tools like Azure Databricks and Azure Data Factory, recommending their adoption when beneficial.

Your Profile

- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- 3-4 years of hands-on experience in a DataOps or related role, using tools like Azure Data Factory, Databricks, and Python.
- Proficiency in automation and scripting, with experience in Azure DevOps pipelines.
- Solid understanding of Big Data concepts and strong software engineering skills for pipeline maintenance and optimization.
- Familiarity with Unity Catalog for efficient data distribution is a plus.
- Fluent English communication skills, both written and spoken.

Nice to Have

- Experience with event-driven architectures (e.g., Kafka, RabbitMQ).
- Hands-on experience with infrastructure as code tools (e.g., Terraform).
- Familiarity with Data Catalog solutions and Big Data integration.
- Relevant certifications from Microsoft or similar vendors.
- Strong team collaboration and communication skills.

If you are interested in this challenging position, we look forward to receiving your comprehensive application for ref. no. 105,013, preferably through our ISG career portal or via email.
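The responsibilities above include monitoring pipelines for accuracy and building CI/CD processes around them. Purely as an illustration (every function name and threshold here is hypothetical, not taken from the posting), a data-quality gate that a CI/CD stage might run could be sketched in Python like this:

```python
# Minimal sketch of a data-quality gate for a CI/CD pipeline step.
# All names, fields, and thresholds are illustrative assumptions,
# not part of the role description.

def null_fraction(records, field):
    """Fraction of records where `field` is missing or None."""
    if not records:
        return 0.0
    missing = sum(1 for r in records if r.get(field) is None)
    return missing / len(records)

def quality_gate(records, required_fields, max_null_fraction=0.01):
    """Simple completeness check: each required field may be null in
    at most `max_null_fraction` of the records.

    Returns (passed, report) where report maps field -> null fraction."""
    report = {f: null_fraction(records, f) for f in required_fields}
    passed = all(frac <= max_null_fraction for frac in report.values())
    return passed, report

if __name__ == "__main__":
    sample = [
        {"id": 1, "amount": 10.0},
        {"id": 2, "amount": None},  # one missing value out of two records
    ]
    passed, report = quality_gate(sample, ["id", "amount"])
    print(passed, report)  # fails: 50% of `amount` values are null
```

In practice such a check would read from the actual pipeline output (e.g. a Databricks table) and fail the build on a non-zero exit code; this sketch keeps the logic self-contained.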



Nominal salary: Negotiable

Source: Jobleads
