• Bachelor's degree in Computer Science, Engineering, or a related field.
• 8+ years of experience in IT, with at least 5 years of hands-on experience with AWS cloud technologies.
• Proven expertise in designing and implementing data platforms on AWS, including EC2, S3, EMR, Redshift, and other relevant services.
• Deep understanding of Databricks architecture, including Delta Lake, Spark, and MLflow.
• Strong knowledge of data engineering principles, data warehousing, and data modeling.
• Experience with data integration, ETL/ELT processes, and data quality management.
• Proficiency in programming languages such as Python and Scala, as well as SQL.
• Excellent communication and interpersonal skills for effective collaboration with cross-functional teams.
• Ability to thrive in a fast-paced, agile environment.