JOB OVERVIEW
As a Senior Data Engineer, you will play a critical role in helping the organization leverage data and technology to drive business growth, customer engagement, and data-driven decision-making. Your primary responsibilities will be building, managing, and optimizing data pipelines, then moving those pipelines into production for critical data and analytics consumers such as business and data analysts, data scientists, and anyone else who needs curated data for analytics use cases across the enterprise.
ROLES AND RESPONSIBILITIES
- Design, build, and maintain our data architecture using technologies like Snowflake, Azure, and Databricks.
- Develop and maintain ELT/ETL workflows using Python, SQL, and Spark to ensure data accuracy, completeness, and consistency.
- Collaborate with cross-functional teams to gather and analyze business requirements and recommend solutions.
- Ensure compliance with data privacy regulations and best practices.
- Implement and manage DevOps processes to automate the deployment and testing of data pipelines and workflows.
- Stay current with the latest data technologies, trends, and best practices, and recommend adoption where appropriate.
- Work in an Agile environment to deliver high-quality solutions on time and within budget.
TECHNICAL COMPETENCIES (Knowledge, Skills & Abilities)
- Strong experience creating pipelines in Snowflake.
- Knowledge of the MarTech ecosystem.
- Comfort manipulating and analyzing complex, high-volume, high-dimensionality data from varying sources.
- Fluency in at least one scripting language, such as Python or R.
- Strong knowledge of DevOps workflows, cloud-native platforms (containers, Kubernetes, serverless, etc.), version control tools (such as Git), and infrastructure-as-code (IaC) tools (such as CloudFormation or Terraform).
- Good written and oral communication skills.
- Highly self-motivated and self-directed.
EDUCATION AND EXPERIENCE
- Bachelor’s degree in computer science, mathematics, or statistics.
- Minimum of five years of equivalent work experience.
- Minimum of three years of cloud platform experience.
- Minimum of three years of experience with Snowflake, Azure, and visualization tools.
- Experience with data integration tools such as Spark/Databricks or equivalent.
- Experience in Linux/Unix shell scripting.
If your experience is close to what we’re looking for, please consider applying. Experience comes in many forms, skills are transferable, and passion goes a long way. We know that diverse backgrounds and experiences make for the best problem-solving and creative thinking, which is why we’re dedicated to adding new perspectives to the team and encourage everyone to apply. We look forward to learning more about you.
#LI-DG1