Senior Data Integration Specialist (global report)

For one of our best clients in the investment sector, we are searching for the following profile to participate in the opening of a new Digital Hub in Madrid.

Position Summary:
As a Data Integration Specialist, you will be responsible for designing, developing, implementing, and managing data integration solutions across various sources and systems within our Asset Management organization. You will play a critical role in ensuring that our data architecture supports our strategic objectives, facilitating effective data analysis and reporting. This position requires a blend of technical expertise, collaboration skills, and a deep understanding of data governance practices.

Key Responsibilities:
- Design, develop, and maintain scalable and efficient data integration processes. Build ETL (extract, transform, load) pipelines that ensure the smooth flow of data from various sources into target systems such as data warehouses or data lakes.
- Support the data quality manager in designing and implementing data quality checks and balances to ensure high data accuracy and integrity. Collaborate with the data governance team to adhere to and promote data governance standards and policies.
- Work closely with business analysts, IT teams, and other stakeholders to understand their data needs and ensure the data integration patterns are defined correctly to meet these requirements. Translate technical details into business terms and vice versa.
- Monitor data integration pipelines for issues, perform root cause analysis, and implement fixes in a timely manner. Optimize and update data integration processes as needed.
- Stay abreast of the latest trends and technologies in data integration and recommend improvements to tools, processes, and technologies to enhance data management practices.
- Document data integration processes, systems, and standards. Provide training and support to team members and stakeholders on data integration tools and best practices.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or related field.
- 3+ years of experience in data integration, ETL/ELT development, or a similar role.
- Strong proficiency in Python and SQL, and experience with ETL/ELT tools.
- Familiarity with data warehousing and data modeling concepts.
- Experience with Databricks and cloud-based data integration services (e.g. Azure Data Factory). Experience with data quality tools (e.g. Collibra) is a plus.
- Experience with implementing both event-based and scheduled data flows and pipelines.
- Familiarity with APIs (e.g. REST).
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- A proactive approach to learning new technologies and methodologies.