Description
We are a forward-thinking company dedicated to innovation, teamwork, and excellence. Our core values drive us to deliver impactful solutions while embracing continuous learning and improvement. We are committed to creating an inclusive and dynamic work environment that fosters creativity and growth.
We are seeking a skilled Lead Cloud Data Engineer to support a prominent government agency. In this role, you will collaborate with a team of experts to design, build, and implement data migration solutions, ensuring the highest standards of data quality, security, and integrity.
Key Responsibilities
- Architect & Develop: Design and maintain large-scale data and analytics platforms, including system integrations, data pipelines, data models, and API integrations.
- Data Migration: Create and execute strategies to migrate data from on-premises sources to AWS cloud environments.
- Collaboration: Work closely with Data Architects to provide insights and guidance. Coordinate with team members to ensure effective data management and quality.
- Prototyping & Improvement: Prototype new business use cases, validate technology approaches, and continuously enhance data solutions to support business objectives.
- Data Integrity: Ensure data integrity and quality throughout large-scale migrations and deliver high-quality data assets for transformative business processes.
- Optimization: Reduce costs by developing reusable components and implementing best practices. Lead efforts to re-platform and re-engineer data pipelines from on-premises to the cloud.
Requirements
- Experience: 8+ years in Data Engineering with a focus on cloud environments.
- Certifications: AWS Cloud Certification.
- Technical Skills:
  - Leadership: Proven experience leading engineering teams and managing tasks and personnel.
  - Data Architecture: Strong understanding of data lake, data lakehouse, and data warehousing architectures in cloud environments.
  - Programming: Proficiency in Python for data manipulation, scripting, and automation.
  - AWS Services: In-depth knowledge of AWS services relevant to data engineering (e.g., S3, EC2, DMS, Glue).
  - ETL Processes: Experience designing and building flexible ETL processes and data pipelines using Python, PySpark, and SQL.
  - Automation Tools: Familiarity with workflow management tools such as Apache Airflow or AWS Step Functions.
  - Databricks: Hands-on experience with Databricks for data ingestion, transformation, and optimization.
  - Cloud Infrastructure: Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation) and containerization technologies (e.g., Kubernetes).
  - Migration Experience: Hands-on experience migrating data from on-premises platforms to cloud environments.
  - Scripting: Linux/RHEL server administration and bash/shell scripting skills.
- Preferred Experience:
  - Education: Bachelor's degree in a related field.
  - Advanced Skills: Familiarity with advanced SQL techniques, data streaming frameworks, and big data technologies.
  - Certifications: Additional certifications in AWS or related technologies.
  - Work Environment: Experience in Agile or DevSecOps environments.
Clearance Requirements
- Must be able to obtain and maintain a Public Trust clearance.
Benefits
- Competitive Compensation: Competitive pay plus paid time off, medical, dental, and vision insurance, and more.
- Professional Growth: Access to education reimbursement for certifications, degrees, and professional development.
- Work-Life Balance: Participate in activities and events designed to support a positive, collaborative work environment.
- Diversity & Inclusion: We are committed to equal opportunity and fostering an inclusive workplace for all employees.
How To Apply
To apply, click the "Apply for this Job" button at the bottom of this description or at the top of this page. Please upload your resume and complete all application steps. For alternative application methods or if you need assistance, please contact us.
Employment Type: Full-Time