There will never be a typical day at Accenture Digital, but that's why people love it here. The opportunities to make a difference while working on exciting client initiatives are limitless in this ever-changing space. Our clients operate at scale in international markets - you'll be working with famous brands and household names - no worrying about how to explain what you do to your family again!

We are looking for a Data Engineer to join us. The hire will be accountable for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.

Responsibilities for Data Engineer

- Create and maintain optimal data pipelines and ETL processes.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Create data tools for analytics and data science team members that help them build and optimize our product into an innovative industry leader.

Qualifications

We are looking for a candidate with 3 years of experience in a Data Engineer role who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
They should also have experience with the following tools and practices:

- Relational SQL and NoSQL databases.
- Data pipeline and workflow management tools such as Airflow, Terraform, and Composer.
- Cloud services such as Google Cloud Storage and BigQuery.
- Python and Java for data transformation.
- Building and optimizing big data pipelines, architectures, and data sets.
- Building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Supporting and working with cross-functional teams in a dynamic environment.