Introduction to Role:
Join us at AstraZeneca's Alexion as a Principal Data Engineer, a key member of the Data Team. This role offers the opportunity to design and oversee the creation of automated data pipelines, implement best practices for data analytics, supervise data model solutions, and offer strategic insights based on data queries. If you are self-directed, have exceptional attention to detail, advanced computational and programming skills, and the capacity to rapidly develop and act upon in-depth knowledge of our business, products, and processes, this role is for you!
Accountabilities:
- Design, enhance, and optimize ELT processes.
- Work with Data Analysts and other stakeholders to determine the data requirements of the various Alexion functions, implement design concepts that support data product self-service solutions, and document the requirements and solution outcomes.
- Design, articulate, and deliver complex, large-scale data solutions that are scalable, robust, secure, and resilient.
- Collaborate with the Data Management team to implement and manage data strategies that optimize metadata, lineage, and quality standards in building FAIR data products.
- Support the development and availability of information assets, in the form of data services and products, for internal and external consumption, ensuring consistency and strategic alignment across all BI and analytics needs.
- Develop ELT processes with our toolset, Python, and SQL queries to create aggregates, cubes, and marts for data solutions based on end-user requirements.
- Assess new data sources to better understand the availability and quality of data.
- Create, design, and maintain reusable datasets for AI projects.
- Provide governance and best practices for data structures, data integrity, and querying.
- Manage and continuously improve support processes and deliverables related to information delivery, working in partnership with the Managed Services team.
- Identify, capture, and manage system defects and root cause assessments.
- Mentor and train personnel within and outside the information delivery group on data requests.

Essential Skills/Experience:
- A Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Business, or a related field.
- 5 or more years of experience in data engineering, business analysis, and data management.
- Experience designing and developing methods to consolidate and analyze structured and unstructured data.
- Experience developing advanced software applications, algorithms, queries, and automated processes.
- Proficiency in Python and technologies such as dbt, Fivetran, Glue, S3, GitHub (CI/CD), Apache Airflow (MWAA), and Snowflake.
- Solid understanding of analytic data architecture and data modeling concepts.
- Previous experience working with upstream source systems such as Veeva CRM, Salesforce, Google Analytics, and SAP.