You own the 'T' in ELT, focusing on delivering clean, transformed data that powers business analytics. You will engage closely with stakeholders to meet their data needs and ensure the reliable, timely delivery of data.

Platform Ownership:
Manage and govern our primary data platform, Snowflake. Ensure secure data access and compliance, maintaining high standards of data protection and accessibility.

Data Integrity:
Develop and enforce data quality checks, build comprehensive data dictionaries, and conduct tests to guarantee the reliability of our data architecture.

Collaboration and Communication:
Work within an agile team environment, liaising between business stakeholders, engineers, and analysts to translate business needs into effective data models. Your ability to communicate complex technical concepts to non-technical team members is vital.

You'll be a great addition to the team if you have:

Advanced SQL Skills:
Exceptional ability in SQL is required for effective data manipulation, analysis, and the crafting of complex queries.

dbt Expertise:
Extensive experience with dbt is crucial for designing and maintaining robust data models and transformation processes that are efficient and scalable.

Big Data Systems Proficiency:
Knowledge of Snowflake is preferred, but experience with other big data systems such as BigQuery will be considered. We're looking for a well-rounded skill set in managing large data environments.

Python Skills:
Familiarity with writing clean, efficient code for data manipulation, automation, and integration tasks is beneficial.

Agile and Adaptive:
Comfortable working in an agile environment, you should be adept at balancing technical precision with timely project delivery, demonstrating flexibility and effectiveness in a dynamic setting.

Technical Curiosity:
A keen interest in continuously enhancing your capabilities in data technologies, including the potential integration of tools like Metaplane for advanced data flow monitoring and performance optimisation.

Extra bonus points if you have:

Experience with Data Monitoring Tools:
Familiarity with Metaplane or similar tools for monitoring data flows.

Proficiency in Version Control Systems:
Notably Git, which is crucial for managing and tracking changes in the codebase effectively.

Experience with Data Orchestration and Ingestion Frameworks:
Proficiency with frameworks such as Fivetran, Airflow, and Dagster, important for automating, managing, and orchestrating data pipelines efficiently.

Knowledge of Data Visualisation Tools:
Experience with tools such as Tableau, essential for creating impactful data visualisations and dashboards.

Our Data Tech Environment:
Snowflake, dbt, SQL, Python, Fivetran, Airflow, Dagster, Metaplane, Tableau.
AWS (including SageMaker, EC2, Lambda, Glue, S3, API Gateway), Terraform, C#, .NET Core.
GitHub for SCM, CI/CD through GitHub workflows.
Google Analytics, GTM, GCP BigQuery.
Robust and performant cloud/serverless applications.
We don't expect you to have experience with all of the technologies above!

How we get there:
Sprints
Jira / Confluence
Pair programming
A focus on experimentation to validate our hypotheses

Want to hear more?
Find out more about Moonpig Group and what it has to offer here!

Moonpig's Commitment to Equality, Diversity and Inclusivity
At Moonpig Group, we're committed to creating an inclusive and caring culture with brilliant people who feel a real sense of belonging. We welcome and celebrate all diverse backgrounds to Moonpig Group, from working parents who need flexibility with their hours to individuals who are neurodiverse and prefer to work a certain way. We're proud to have several employee-led committees within our organisation, including the LGBTQ+, Gender Balance, Neurodiversity and EMBRACE (Educating Myself for Better Racial Awareness and Cultural Enrichment) Committees. We'll continue to push for diversity and that sense of belonging so that all Moonpig Group employees feel safe and comfortable to be their true authentic selves at work.