Only U.S. citizens are eligible for this role due to ITAR regulations.
Key Responsibilities:
* Implement and manage infrastructure as code (IaC) using AWS CDK or CloudFormation.
* Collaborate with data scientists and analysts to understand data requirements and translate them into technical specifications.
* Optimize data storage and retrieval, ensuring high performance and reliability.
* Monitor, troubleshoot, and improve data pipelines to handle large-scale data processing.
* Ensure data quality and integrity across all data sources and pipelines.
* Stay up-to-date with emerging technologies and best practices in data engineering and cloud services.
Required Skills and Qualifications:
* Proven experience with AWS services (e.g., S3, Lambda, Redshift, RDS).
* Strong hands-on experience with AWS CDK or CloudFormation for infrastructure as code.
* Proficiency in Apache Spark for large-scale data processing.
* Experience with Airflow for orchestrating complex data workflows.
* Solid understanding of data modeling, ETL processes, and data warehousing concepts.
* Proficiency in Python or a similar programming language.
* Familiarity with version control systems like Git.
* Excellent problem-solving skills and attention to detail.
* Strong communication and collaboration skills.