Job Description:
We are seeking a highly skilled and experienced GCP Data Engineer to join our Data Migration team. In this role, you will focus on building data pipelines into BigQuery on GCP using Dataflow and custom Python scripts. The ideal candidate has excellent analytical and problem-solving skills, a keen attention to detail, and the ability to communicate effectively with both technical and non-technical stakeholders. Experience working in Agile development environments is a plus.
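To give candidates a concrete flavor of the pipeline work described above, here is a minimal sketch using the Apache Beam Python SDK, the programming model behind Google Dataflow. The project, bucket, table, and schema names are hypothetical placeholders for illustration only, not details of our environment.

```python
# Minimal Beam pipeline sketch: read JSON events from Cloud Storage and
# append them to a BigQuery table. All resource names are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        runner="DataflowRunner",        # use "DirectRunner" to test locally
        project="example-project",      # hypothetical project ID
        region="us-central1",
        temp_location="gs://example-bucket/tmp",  # hypothetical bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadSource" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "ParseJson" >> beam.Map(json.loads)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",  # hypothetical table
                schema="event_id:STRING,ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```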
Responsibilities:
- Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, and establish best practices for pipeline operations.
- Enable simplified user access to massive datasets by building scalable data solutions.
Qualifications:
- Minimum of 5 years of overall experience in IT or professional services, including at least 3 years of relevant experience on GCP; experience with other cloud platforms is a plus.
- Expertise in building data integration and preparation tools using cloud technologies such as Google Dataflow, Cloud Dataprep, and Python.
- Experience with the Google tech stack, including Dataflow, BigQuery, Dataprep, Cloud Composer, Bigtable, Cloud Storage, Datastore, Spanner, and Cloud SQL.
- Proficiency in Python, with an emphasis on building data pipelines, and expert knowledge of SQL development.
- Ability to identify downstream implications of data loads and migrations (e.g., data quality, regulatory compliance).
- Ability to work in a rapidly changing business environment.
- Advanced SQL writing skills and experience in data mining (SQL, ETL, data warehousing) using databases with complex datasets in a business setting; see the query sketch after this list.
- Detail-oriented, with strong documentation skills.
- Exposure to Tableau or Looker is a plus.
- Ability to work effectively with colleagues of diverse skill sets and backgrounds.
- GCP certification is a plus.
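As referenced in the SQL item above, here is an illustrative sketch of the analytical query work involved, run through the google-cloud-bigquery Python client. It reuses the hypothetical analytics.events table from the pipeline sketch; all names are assumptions for illustration.

```python
# Illustrative BigQuery query sketch: daily event counts for the last week
# from a hypothetical analytics.events table.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

query = """
    SELECT DATE(ts) AS day, COUNT(*) AS events
    FROM `example-project.analytics.events`
    WHERE ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY day
    ORDER BY day
"""

for row in client.query(query).result():  # blocks until the query finishes
    print(f"{row.day}: {row.events}")
```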