As a Senior Data Engineer, you will play a pivotal role in designing, implementing, and optimising robust data solutions to support business-critical analytics and decision-making processes. This role requires expertise in building and managing data pipelines, working with data warehouse platforms such as Redshift and Oracle, and delivering end-to-end data engineering projects. You will collaborate with cross-functional teams to ensure seamless integration of data solutions, delivering high-quality analytics and reporting capabilities while leveraging modern DevOps practices. Your contributions will directly influence Kindred's ability to harness data for strategic insights and innovation.

What you will do
Enhance Data Warehousing and Reporting:
- Develop, optimise, and maintain stored procedures, functions, and queries for both Redshift and Oracle databases.
- Design and implement robust data warehouse architectures to support analytical and operational reporting needs.
- Create advanced reporting solutions using Power BI, AWS QuickSight, and other visualisation tools to deliver actionable insights.

Build and Optimise Data Pipelines:
- Design, develop, and maintain scalable data pipelines using Java Spring Boot, Python, and orchestration tools such as Apache Airflow.
- Implement and optimise ETL/ELT processes for the efficient handling of large-scale data flows.
- Leverage Kafka alongside the Redshift and Oracle data warehouses to build performant data models and pipelines.

Cloud Infrastructure and DevOps:
- Architect and manage data solutions on AWS, including services such as Redshift, S3, EC2, and Lambda.
- Collaborate with DevOps teams to implement CI/CD pipelines, containerised environments using Docker and Kubernetes, and infrastructure-as-code tools such as Terraform.

End-to-End Delivery:
- Lead and manage end-to-end project lifecycles, from requirements gathering to deployment and post-production support.
- Ensure alignment with business objectives and timely delivery of data solutions by collaborating closely with Product Owners and Engagement Managers.

Collaboration and Stakeholder Engagement:
- Work with cross-functional teams to translate business requirements into technical solutions.
- Act as a bridge between data engineering and business teams to ensure data solutions meet stakeholder needs and align with strategic goals.

Innovation and Continuous Learning:
- Stay up to date on emerging technologies in data engineering, cloud computing, and DevOps to future-proof our data solutions.
- Share knowledge through mentoring, documentation, and internal training sessions to foster a learning culture.

Compliance & Training:
- Adhere to the Governance, Risk & Compliance (GRC) obligations for your role.
- Identify and raise any non-compliance incidents promptly to your line manager.
- Challenge processes, policies, and projects that would negatively impact compliance within the Group.
- Complete all mandatory compliance training assigned to you.
- Reach out to the Compliance Teams if you are unsure of any of your compliance obligations or if the requirements are unclear.

Your experience
Core Technical Skills:
- Data Warehousing: Experience with Redshift and Oracle, including developing and optimising stored procedures, complex queries, and data models.
- Programming Languages: Strong proficiency in Java (Spring Boot) and Python for data engineering and API development.
- Data Pipeline Expertise: Hands-on experience building scalable ETL/ELT pipelines using frameworks such as Apache Airflow.
- Cloud Proficiency: Proven expertise with AWS services, including Redshift, S3, EC2, and Lambda, as well as Kafka.
- Reporting Tools: Advanced skills in Power BI, including DAX, Power Query, and building interactive dashboards for complex datasets.

DevOps and Automation:
- Experience with containerisation (Docker) and orchestration (Kubernetes).
- Proficiency in CI/CD tools such as Jenkins, GitHub Actions, or GitLab CI/CD.
- Familiarity with infrastructure-as-code tools such as Terraform or Ansible.

End-to-End Delivery:
- Demonstrated ability to manage end-to-end delivery of data projects, from requirements gathering to deployment and ongoing support.
- Proven track record of delivering projects on time while meeting stakeholder expectations.

Soft Skills:
- Excellent communication and collaboration skills to engage effectively with technical and non-technical stakeholders.
- Strong problem-solving skills with a proactive and detail-oriented mindset.
- Passion for mentoring and for creating a culture of continuous improvement within the team.

Consulting/Professional Services Background:
- Proven ability to work in a consulting or professional services role, building trust with stakeholders and communicating complex solutions effectively.

Desired Qualities:
- Experience with other programming languages such as Scala or .NET.
- Familiarity with Power BI Fabric technologies.
- Knowledge of data governance and compliance best practices.