For over a decade now, OpenNebula Systems has been leading the development of the European open source technology that helps organizations around the world manage their corporate data centers and build their Enterprise Clouds. If you want to join an established leader in the cloud infrastructure industry and the global open source community, keep reading: you can now join a team of exceptionally passionate and talented colleagues whose mission is to help the world's leading enterprises implement their next-generation edge and cloud strategies. We are hiring!

Since 2019, and thanks to the support of the European Commission, OpenNebula Systems has been leading edge computing innovation in Europe, investing heavily in research and open source development and playing a key role in strategic EU initiatives such as the IPCEI-CIS and the "European Alliance for Industrial Data, Edge and Cloud".

We are currently looking for a Senior Technologist for Artificial Intelligence with expertise in LLMs to join us in Europe as part of our new team developing the AI-enabled operations component of the next-generation management platform for the Cloud-Edge Computing Continuum.

Job Description

The AI Operations Team is responsible for developing the AI-enabled engine that optimizes operations on cloud and edge infrastructures. The engine will provide smart monitoring, intelligent workload forecasting, workload and infrastructure orchestration capabilities, and log and metric anomaly detection. The AI Team is also responsible for building new frameworks for AI on the cloud-to-edge continuum that enable different downstream applications. This covers advanced prompt engineering and development, Retrieval-Augmented Generation (RAG), fine-tuning, and instruction tuning to augment LLMs.

We are looking for an experienced Software Developer with a strong understanding of LLMs, AI, AI Ops, and cloud best practices and deployments. The ideal candidate will be product-minded and experienced in Python back-end development. This role involves staying up to date with the latest research, advancements, and techniques for using LLMs and generative AI in the context of Cloud-Edge Operations. AI Engineers combine an empirical, scientific approach to validating and evaluating the accuracy of LLM outputs with creative problem-solving to develop better methods in the novel field of LLM prompting and retrieval, along with data engineering best practices to optimize the cost and performance of cloud deployments and operations.

You will work in an agile environment to design, develop, test, maintain, and validate against real use cases a next-generation management platform for the Cloud-Edge Computing Continuum. You will also participate in the upstream community, working on challenging projects that develop innovative edge/cloud systems. Applicants should be passionate about the future of software-defined data centers, distributed systems, and open source.
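To give candidates a concrete feel for the retrieval-augmented workflows mentioned in the job description, here is a minimal, self-contained Python sketch of retrieving relevant log lines and assembling them into an LLM prompt for operations triage. It is purely illustrative: the names (LogRetriever, build_prompt) and the keyword-overlap scoring are hypothetical assumptions for this example, not part of any OpenNebula product or API.

"""
Minimal sketch: retrieval-augmented prompt construction for log anomaly triage.
All names and the scoring scheme here are illustrative assumptions.
"""

from collections import Counter


class LogRetriever:
    """Toy keyword-overlap retriever over a list of log lines."""

    def __init__(self, log_lines: list[str]):
        self.log_lines = log_lines

    def top_k(self, query: str, k: int = 3) -> list[str]:
        query_terms = Counter(query.lower().split())

        def score(line: str) -> int:
            # Count how many query terms appear somewhere in the log line.
            return sum(count for term, count in query_terms.items()
                       if term in line.lower())

        return sorted(self.log_lines, key=score, reverse=True)[:k]


def build_prompt(question: str, context_lines: list[str]) -> str:
    """Combine retrieved log context with the operator question into one prompt."""
    context = "\n".join(f"- {line}" for line in context_lines)
    return (
        "You are an assistant for cloud-edge operations.\n"
        f"Relevant log excerpts:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )


if __name__ == "__main__":
    logs = [
        "2024-05-01 12:00:01 host-3 kvm: VM 42 migrated successfully",
        "2024-05-01 12:00:07 host-1 monitord: CPU usage above 95% for 5 minutes",
        "2024-05-01 12:00:09 host-1 oned: VM 17 deployment failed: no capacity",
    ]
    retriever = LogRetriever(logs)
    question = "Why did the VM deployment fail on host-1?"
    prompt = build_prompt(question, retriever.top_k(question))
    print(prompt)  # In a real pipeline, this prompt would be sent to an LLM.

In production, the toy retriever above would be replaced by embedding-based search over monitoring data and documentation, which is the kind of design work this role involves.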