Join our dynamic team as a Data Engineer, where you will have the opportunity to engage in a broad range of impactful responsibilities:
Developing the Core Data Platform: Build and enhance our data platform using AWS and Databricks, including writing Terraform and deploying robust infrastructure.
Collaborating Across Teams: Work closely with Product, Marketing, and other departments to enable data-driven decision-making and deliver insights and tools that drive innovation.
Designing and Implementing Pipelines: Create real-time data pipelines and batch ETL processes, and uphold high standards for data integrity, quality, and governance.
Managing the Data Warehouse: Maintain and optimize our data warehouse, focusing on best practices for performance, scalability, and disaster recovery.
Desired Technical Skills:
Proficiency in Python and Spark.
Hands-on experience with AWS Cloud (preferred).
Familiarity with Infrastructure as Code (IaC) tools such as Terraform.
Strong knowledge of SQL and NoSQL databases.
Proven experience in building data warehouses and pipelines.
Expertise in data platform solutions such as Databricks and/or Snowflake.
Experience with containerization tools like Docker on ECS or Kubernetes (EKS), and with orchestration tools (e.g., AWS Step Functions, Airflow, Dagster).
Nice to have:
Exposure to Change Data Capture (CDC) pipelines, preferably with AWS DMS.
Understanding of event streaming architectures.
This role offers an exciting opportunity to shape the future of our data landscape while working with cutting-edge technologies.