Data Engineer - ETL with Python, Hadoop & Snowflake
Responsibilities
- Translate business needs into logical ETL/ELT models, and design data structures with the flexibility to scale as business solutions grow
- Design and implement data pipelines using AWS Glue, AWS Lambda, Spark, and Python
- Define and deliver reusable components for the ETL/ELT framework
- Define optimal data flows for system integration and data migration
- Integrate new data management technologies and software engineering tools into existing structures
Required Skills/Experience
1. Design, development, and implementation of large-scale projects in the financial industry using data warehousing ETL tools (Pentaho)
2. Experience creating and scheduling ETL transformations and jobs with the Pentaho Data Integration (Kettle) Spoon designer
3. Strong knowledge of and experience with SQL, Snowflake, Python, and Spark
4. Experience with big data/distributed frameworks such as Spark, Kubernetes, Hadoop, and Hive
5. Strong customer-service orientation to consistently and effectively address client needs; self-motivated and comfortable working independently under general direction