NDT Global, Barcelona, Spain

Job Description
Posted Friday, May 17, 2024 at 4:00 AM

NDT Global is the leading provider of ultra-high-tech diagnostic inspection solutions, advanced data analysis, and integrity assessment services for ensuring the safety and longevity of energy-sector infrastructure assets. Recognized as the forerunner in ultrasonic inspection technologies comprising Pulse Echo, Pitch-and-Catch, and Phased Array, as well as Acoustic Resonance (ART Scan) methodologies, the company also deploys a range of non-ultrasonic technologies, such as Inertial Measurement Units, with more under development. NDT Global strategically applies its inspection technologies to detect, diagnose, and model various types of threat, including circumferential or axial cracks, metal loss, geometry, and mapping, across diverse classes of assets. By providing predictive, decision-ready insights driven by the world's most accurate data, NDT Global enables asset owners to optimize infrastructure health and drive operational efficiencies while reducing risk and minimizing their carbon footprint.

Purpose
The Data Engineer architects, specifies, and implements data access, ingestion, processing, and provisioning of large-scale datasets into the Big Data infrastructure. The role ensures that architectural, test-driven, and clean-code principles are followed.
This position also covers the design, setup, maintenance, and upgrade of the production environment.

Tasks
- Designs, specifies, and writes software code for different projects in Big Data environments
- Mentors, supports, and coaches colleagues, Data Scientists, and Engineers in Big Data and Data Engineering topics
- Maintains software code and infrastructure in the Big Data environments
- Designs and prepares the Big Data platform for machine learning and data processing in production environments
- Documents the developed solutions and maintains consistency of information
- Takes responsibility for the overall software and code base and implements the software code into production
- Performs code reviews and ensures the quality of the developed software in the production stage

Requirements
- A bachelor's degree in Computer Science, Engineering, or a related field is required
- A minimum of 6 years of experience in software development and Big Data frameworks
- A minimum of 3 years of experience working with Big Data frameworks is required
- Advanced experience in Scala, Apache Spark, Kafka, and distributed storage systems (S3, Ceph)
- Experience with NoSQL databases (Redis) and SQL databases (MySQL, Microsoft SQL Server)
- Familiarity with cloud platforms (AWS or Azure) and their big data services (e.g., EMR, Databricks)
- Experience with Apache NiFi is a plus
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus

Skillset
- Proficient in spoken and written English
- Advanced programming skills (functional or object-oriented)
- Knowledge of machine learning is preferred
- Strong background in big data processing, data pipeline development, and performance optimization
- Excellent problem-solving skills and the ability to troubleshoot complex data issues
- Strong communication skills and the ability to work collaboratively in a team environment

Benefits
- Competitive salary and comprehensive benefits package: health insurance, pension plan, Gym Flex, eyecare, 23 days annual leave
- Flexible and hybrid working: our flextime model allows you to design your working day as it suits your needs
- We think about the future: individual training and development opportunities and professional development schemes
- Opportunity to work with cutting-edge technologies and make a significant impact on our data strategy
- International, intercultural, and young working environment
- Great company culture and office environment