About us
At Auctane, we are united by a passion to help sellers — wherever they are, however they operate — fulfill the promises they make to consumers.
The Auctane mission is to fuel commerce through exceptional delivery.
We make it possible for businesses to meet the ever-rising expectations of their customers, and we make the world smaller and more accessible to consumers everywhere.
Auctane brands enable hundreds of thousands of merchants to deliver billions of products annually — over $200 billion worth — to customers around the globe.
And Auctane is just getting started.
Auctane is a team of shipping and software experts with a passion for helping merchants move their ideas, dreams and innovations around the globe.
The Auctane family includes ShipStation, ShipWorks, ShipEngine, ShippingEasy, Stamps, Endicia, Metapack, Shipsi, GlobalPost, and Packlink.
Our partners include Amazon, UPS, USPS, eBay, BigCommerce, Shopify, WooCommerce, and Walmart.
About the role
This is a hands-on role for a Data Engineering Technical Lead sitting within the Data Products team.
In this role, you will have an enormous impact on the design, implementation and maintenance of existing and greenfield customer-facing data products.
Data is key to Auctane's strategy.
We work at scale and at speed, with the latest architecture patterns and tech.
We process thousands of events per second, and our massive dataset keeps growing at a staggering pace.
We keep improving our data platform and data engineering stack to accommodate growth, enable novel solutions and provide the best service to our customers.
We have a flat and open engineering culture where data & evidence beat opinion and hierarchy, backed by honest discussions.
We passionately believe in forming autonomous, cross-functional teams who are empowered to deliver our ambitious strategy.
Energy and passion for our business and customers are part of the Auctane culture, and we love working with like-minded people.
What you'll be doing
- Lead and manage a team of data engineers, and contribute to the design, build and operational management of our data pipelines and analytics solution on top of proven AWS data technologies like S3, Athena, Lambda, Kinesis and ECS.
- Motivate and grow the team to develop their skills and increase their ownership and impact.
- Collaborate with the Product Owner and other stakeholders to implement the Data Products strategy.
- Develop frameworks and solutions that enable us to acquire, process, monitor and extract value from our massive dataset.
- Drive the design and architecture of Auctane's data products.
- Contribute directly to the implementation and operation of our systems.
- Support the Data Analysts and Data Scientists with automation, tooling, data pipelines and data engineering expertise, enabling them to build rich and complex analytical features and machine learning solutions for our data products.
- Be an advocate of data quality and observability principles, and use state-of-the-art technologies like Airflow, dbt and Spark to process data and get our datasets just right.
- Define the team's roadmap based on input from the Product Owner, other leads and stakeholders.
- Foster engineering excellence by delivering highly reliable software and data pipelines using software engineering best practices like automation, version control, continuous integration/continuous delivery, testing and security.
- Define, implement and enforce automated data security and data governance best practices within the solutions we design.
- Mentor more junior colleagues and coach the team on coding and data engineering best practices and patterns.

What are we looking for?
- A strong data engineering background.
- Experience developing and supporting robust, automated and reliable data pipelines in Python and SQL.
- Experience with data processing frameworks like Spark, Trino (Athena), Pandas or dbt.
- Experience with streaming data processing.
- AWS, Azure or Google Cloud experience.
- Continuous integration/delivery experience and a passion for automation.
- Knowledge of a data orchestration solution like Airflow, Oozie, Luigi or Prefect.
- Knowledge of both relational and non-relational database design.
- Knowledge of how to design distributed systems and the trade-offs involved.
- Experience with software engineering best practices for development, including source control systems, automated deployment pipelines like Jenkins, and DevOps tools like Terraform.
- Comfort working within a geographically distributed team.

What will make you stand out?
- Production experience working with very large datasets.
- Experience with AWS and other cloud big data technologies like EMR, Athena (Trino), Glue, BigQuery, Dataproc or Dataflow.
- Direct experience with Infrastructure as Code tools (e.g. Terraform, CloudFormation).
- Knowledge of and direct experience with business intelligence and analytics.