About Swiss Re
Swiss Re is one of the world's leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient.
We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime.
At Swiss Re, we combine experience with creative thinking and cutting-edge expertise to create new opportunities and solutions for our clients.
This is possible thanks to the collaboration of more than 13,000 employees across the world.
We offer a flexible working environment where curious and adaptable people thrive.
Are you interested in joining us?
About the Role
In the Stargate Platform Engineering team, we are searching for an enthusiastic Data Operations Engineer who is ready for an exciting career!
We focus on optimizing Big Data distributed systems, building and improving data pipelines while making our platform more efficient, secure and reliable.
Our team members range from PhDs in Mathematics to contributors to open-source Apache Big Data projects.
Whether you are a fresh graduate or a more experienced professional, we look forward to getting to know you.
If you join us, you will be able to participate in a variety of interesting initiatives.
For example, you will:
- Architect, optimize and set up terabyte-scale ingestions on a day-to-day basis. Stargate is a polyglot platform, ready to consume data from multiple sources including REST endpoints, Azure blob stores and large-scale databases.
- Design and implement network tasks to handle terabytes of daily traffic efficiently, or configure the cloud infrastructure used in these ingestion processes.
- Support many other exciting projects in a collaborative environment with colleagues from Swiss Re's international offices, building the data backbone of the company.
About the Team
We are a diverse team, and our passion to learn is what connects us.
We are looking for a person who loves to learn and we will support you in this journey!
We reserve time for studying, getting certificates, attending conferences, and we organize our own upskilling sessions as well.
In addition to projects for our clients, we also work on innovation projects which allow us to learn new methodologies and gain fresh experience.
About You
Nobody is perfect or meets 100% of the requirements. If, however, you meet some of the criteria below and are genuinely curious about the world of data science, we will be happy to meet you.
To be successful in the role, you need the following technical skills and knowledge:
- Degree in Computer Science, Computer/Electronics Engineering, Applied Mathematics, Physics or a related quantitative field. We welcome different levels of experience, including graduates.
- Knowledge of the Python programming language and bash scripting.
- Knowledge of Linux administration, infrastructure and network-related topics.
- Knowledge of DevOps principles.
If you also have the following experience and interests, it is a huge plus:
- Experience in database optimization (e.g. load analysis, query planning).
- Interest and willingness to learn high-performance computing, Big Data, algorithms and more technical Python.
- AWS/Azure/GCP certificates.
We are an equal opportunity employer, and we value diversity at our company.
Our aim is to live visible and invisible diversity - diversity of age, race, ethnicity, nationality, gender, gender identity, sexual orientation, religious beliefs, physical abilities, personalities and experiences - at all levels and in all functions and regions.
We also collaborate in a flexible working environment, providing you with a compelling degree of autonomy to decide how, when and where to carry out your tasks.