Indianapolis, IN (Hybrid, 3 days a week onsite)
Work Authorization: USC (U.S. Citizen)
Key Responsibilities
- Develop and Execute POCs: Create proof-of-concepts to showcase the capabilities of existing internal tools and solutions.
- Tool Comparison: Analyze and compare the effectiveness of current tools versus potential new solutions, such as Databricks.
- Data Transformation: Lead the transformation of data applications from Redshift/Stored Procedures to a new Glue/Iceberg framework.
- Data Visualization and Integration: Evaluate Microsoft Fabric for data visualization and integration and assess the value it would add.
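To make the migration item above concrete: a Glue job moving a Redshift-sourced table into Iceberg typically issues a CTAS plus a load. The sketch below only builds the Spark SQL statements such a job might run; it is an illustration, not the team's framework, and every database, table, and column name in it is a hypothetical placeholder.

```python
def iceberg_migration_sql(src_table, dest_db, dest_table, partition_col=None):
    """Build Spark SQL a Glue job could run to copy a staged source
    table into an Iceberg table. All names are placeholders."""
    dest = f"glue_catalog.{dest_db}.{dest_table}"
    partition = f"\nPARTITIONED BY ({partition_col})" if partition_col else ""
    return [
        # CTAS with an always-false filter: create the Iceberg table
        # with the source schema but no rows.
        f"CREATE TABLE IF NOT EXISTS {dest}\nUSING iceberg{partition}\n"
        f"AS SELECT * FROM {src_table} WHERE 1 = 0",
        # Load the data; in Iceberg this lands as an atomic, versioned commit.
        f"INSERT INTO {dest} SELECT * FROM {src_table}",
    ]
```

For example, `iceberg_migration_sql("staging.orders", "analytics", "orders", "order_date")` yields the create and insert statements for a hypothetical date-partitioned orders table.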
Required Skills
- Databricks: Proficiency in Databricks for data engineering, analytics, and machine learning.
- DevOps: Experience with GitHub Actions, AWS CDK, and CI/CD pipeline development.
- SQL/PL/SQL: Strong SQL and PL/SQL skills for reading, understanding, and working with existing code.
- Python Development: Experience in Python programming for data processing.
- AWS Development: Solid experience with AWS services, particularly in data-centric projects.
- PySpark/Glue API: Hands-on experience with PySpark and/or AWS Glue API development.
- Documentation/Communication: Excellent skills in documenting processes and communicating complex concepts effectively.
- Microsoft Fabric: Experience with Microsoft Fabric for data visualization and integration is a plus.
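The SQL and Python skills above come together in migration validation: after porting a query from Redshift/PL/SQL to the new framework, one common check is that both pipelines still produce the same result set. A minimal pure-Python sketch of such a parity check follows; the function names and sample fields are illustrative, not part of any existing tooling.

```python
import hashlib
import json

def dataset_fingerprint(rows):
    """Return (row_count, order-independent checksum) for a result set
    given as a list of dicts. Sorting the per-row digests makes the
    checksum insensitive to row order, which typically differs between
    engines."""
    digests = sorted(
        hashlib.sha256(
            json.dumps(row, sort_keys=True, default=str).encode()
        ).hexdigest()
        for row in rows
    )
    return len(rows), hashlib.sha256("".join(digests).encode()).hexdigest()

def outputs_match(old_rows, new_rows):
    """True if the legacy and migrated outputs agree on count and content."""
    return dataset_fingerprint(old_rows) == dataset_fingerprint(new_rows)
```

Comparing counts plus an order-independent checksum catches dropped, duplicated, or altered rows without shipping full result sets between systems.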