Company Overview
We are a forward-thinking organization committed to innovation, teamwork, and excellence. Our core values center on delivering impactful solutions while fostering continuous learning and improvement. We pride ourselves on creating an inclusive and dynamic work environment that promotes creativity and growth.
Position Overview
We are looking for an experienced professional to support a significant government agency. In this role, you will collaborate with a team of experts to design, build, and implement data migration solutions while adhering to the highest standards of data quality, security, and integrity.
Key Responsibilities
- Architect & Develop: Design and maintain large-scale data and analytics platforms, including system integrations, data pipelines, data models, and API integrations.
- Data Migration: Create and implement strategies for migrating data from on-premises sources to AWS cloud environments.
- Collaboration: Partner closely with Data Architects to provide insights, and coordinate with team members to ensure effective data management and quality.
- Prototyping & Improvement: Prototype new business use cases, validate technology approaches, and enhance data solutions to align with business objectives.
- Data Integrity: Ensure the integrity and quality of data throughout extensive migrations and deliver high-quality data assets for transforming business processes.
- Optimization: Lead efforts to reduce costs by developing reusable components, implementing best practices, and re-platforming data pipelines from on-premises to the cloud.
Requirements
- Experience: 8+ years in Data Engineering, with significant work in cloud environments.
- Certifications: AWS Cloud Certification.
- Technical Skills:
  - Proven leadership experience in managing engineering teams and tasks.
  - Strong understanding of data lake, data lakehouse, and data warehousing architectures in cloud environments.
  - Proficiency in Python for data manipulation, scripting, and automation.
  - In-depth knowledge of AWS services pertinent to data engineering (e.g., S3, EC2, DMS, Glue).
  - Experience designing and building flexible ETL processes and data pipelines using Python, PySpark, and SQL.
  - Familiarity with workflow management tools such as Apache Airflow or AWS Step Functions.
  - Hands-on experience with Databricks for data ingestion, transformation, and optimization.
  - Knowledge of infrastructure-as-code tools (e.g., Terraform, CloudFormation) and containerization technologies (e.g., Kubernetes).
  - Experience migrating data from on-premises platforms to cloud environments.
  - Competency in Linux/RHEL server administration and bash/shell scripting.
- Preferred Experience:
  - Education: Bachelor's Degree in a related field.
  - Advanced Skills: Familiarity with advanced SQL techniques, data streaming frameworks, and big data technologies.
  - Certifications: Additional certifications in AWS or related technologies.
  - Work Environment: Experience in Agile or DevSecOps environments.
Clearance Requirements
- Must be eligible to obtain and maintain a Public Trust clearance.
Benefits
- Competitive Compensation: Generous paid time off and comprehensive medical, dental, and vision insurance.
- Professional Growth: Access to education reimbursement for certifications, degrees, and professional development opportunities.
- Work-Life Balance: Engage in activities and events designed to enhance your work experience and promote a collaborative environment.
- Diversity & Inclusion: Committed to equal opportunity and fostering an inclusive workplace for all employees.
Application Process
To apply for this opportunity, please click the "Apply for this Job" button at the bottom or top of this page. Upload your resume and complete all necessary application steps. For alternative application methods or assistance, please reach out to us.
Employment Type: Full-Time