Design, build, and maintain scalable data solutions using AWS services, ensuring peak performance and reliability.

Data Management: Expertly manage data storage, retrieval, and processing across AWS S3, Redshift, and other AWS tools.

Innovation: Lead the charge in integrating the latest AWS technologies and best practices into our data engineering processes.

Collaboration: Work closely with data architects and lead other engineers to deliver top-tier data solutions.

Optimization: Continuously refine data workflows and processes to achieve unparalleled efficiency and minimal latency.

Security: Implement and uphold robust data security protocols to safeguard our most valuable asset: information.

Specification

AWS Expert: Deep expertise with AWS services, particularly S3, Redshift, Lambda, Glue, and RDS.

Data Engineering: Exceptional skills in data engineering, ETL processes, and data warehousing.

DBT: Although not an AWS technology, its relevance makes it important to highlight.

Kinesis: Essential for near-real-time data streaming.

PySpark: A fundamental component for ETL processes; although it is covered under Glue, it is worth calling out explicitly.

Experience: Proven excellence in engineering data solutions, with at least 3 years of experience in the AWS environment.

Problem-Solving: Superior analytical and problem-solving abilities, adept at resolving complex data challenges.

Communication: Outstanding communication skills, capable of translating technical concepts for non-technical stakeholders.

Agility: Thrive in a high-octane, dynamic environment, adapting swiftly to new challenges.

Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

Certification: AWS Certified Solutions Architect or an equivalent certification is highly desirable.

Be the force behind Formula 1!

Division: Technical

For more details, including salary and company information, use the apply link.