People are at the center of everything we do at TK Elevator: we work as one team, watch out for each other's safety, and share a common vision to keep the world moving. This is your chance to go beyond and impact the future of urban mobility. We are looking for talented people who want to start or develop their professional career with us. With integrated cloud-based solutions such as our MAX platform, we embrace the full potential of digitalization, transforming ourselves into a digitally augmented company to make life easier, more efficient and more comfortable.

Key Responsibilities:

Design and Develop Data Architectures:
- Architect and implement scalable, high-performance data storage solutions (data warehouses, data lakehouses).
- Develop and maintain data models and database designs that meet business and technical requirements.
- Create and optimize data pipelines and workflows for efficient data processing.

Implement Data Solutions:
- Build robust ETL/ELT processes to extract, transform, and load data from various sources.
- Ensure data architectures support advanced analytics, reporting, and business intelligence needs.
- Integrate structured and unstructured data sources to enable comprehensive data analysis.

Technical Leadership:
- Provide expertise in data architecture to guide development teams and ensure best practices.
- Lead technical projects related to data infrastructure and systems.
- Collaborate with data engineers and developers to implement data solutions effectively.

Ensure Data Quality and Security:
- Implement data governance policies and data quality standards.
- Monitor data systems for performance, reliability, and security compliance.
- Apply data security measures to protect sensitive information and ensure regulatory compliance.

Stay Current with Emerging Technologies:
- Research and evaluate new technologies and tools to enhance data architecture.
- Incorporate industry best practices and innovative solutions into TKE's data strategy.

Optimize System Performance:
- Tune databases and data processing systems for optimal performance.
- Resolve technical issues related to data architecture and improve system scalability.

Minimum Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related technical field.
- Experience in data architecture, data engineering, or a similar technical role.
- Proficiency in SQL and programming languages such as Python.
- Hands-on experience with big data technologies such as Apache Spark.
- Strong knowledge of Azure cloud services, including Databricks, Azure Data Factory, Azure Synapse Analytics, and Azure Analysis Services.
- Familiarity with BI tools such as Power BI (preferred), Tableau, or Qlik.
- Deep understanding of normalized and denormalized data structures, and of relational and dimensional data models.
- Experience with data warehouse concepts, data mesh, medallion/lakehouse architectures, and ETL/ELT processes.