Job Summary
We are looking for an experienced Data Engineer with strong proficiency in Go, Python, AWS, and GCP to design, build, and maintain robust data pipelines and cloud-based data solutions. The ideal candidate will develop and optimize scalable data workflows, ensuring data accessibility and security while supporting cross-functional teams in achieving data-driven objectives.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using Go and Python.
- Build and manage cloud-based data infrastructure on AWS and GCP.
- Implement data cleansing, transformation, and integration processes.
- Optimize cloud storage and data solutions (Amazon S3, Amazon RDS, Google BigQuery) for performance and cost.
- Collaborate with data scientists, engineers, and stakeholders to define data needs and solutions.
- Ensure data integrity, security, and compliance within cloud environments.
- Troubleshoot and resolve issues in data pipelines and workflows.
- Automate and streamline data access and processing through workflow automation.
- Develop documentation and best practices for data management and governance.
Required Qualifications
- Strong experience with Go and Python programming languages.
- Expertise in AWS and GCP cloud platforms.
- Solid knowledge of SQL and database management.
- Proven experience building and maintaining scalable data pipelines.
- Familiarity with cloud data services such as Amazon S3, Amazon RDS, and Google BigQuery.
- Strong understanding of data transformation tools and techniques.
- Excellent problem-solving skills and attention to detail.
Preferred Qualifications
- Experience with additional cloud platforms (e.g., Azure) or container orchestration tools (e.g., Kubernetes).
- Knowledge of data warehousing and ETL processes.
- Familiarity with data governance, privacy, and security standards.
- Experience with CI/CD pipelines and version control tools.
- Strong communication and teamwork skills.
Location: Plano, McLean, Richmond
Education: Bachelor's Degree