Description
We are offering an exciting opportunity for a Data Engineer to join our team in Columbus, Ohio, United States. The successful candidate will play a pivotal role in our IT team, providing data solutions that support business strategies. This role involves working closely with the corporate business intelligence teams to develop, maintain, and enhance data solutions that enable strategic decision-making.
Responsibilities
- Collaborate with the Data Architect to develop, maintain, and improve data solutions for analytics and business intelligence.
- Design and implement high-performance data ingestion pipelines from multiple sources using Databricks and Azure-native solutions.
- Develop and document ETL/ELT architecture, including change data capture (CDC), mappings, data flows, procedures, and schedules, ensuring overall integration integrity.
- Work with source-system and business subject matter experts to understand data requirements and the options available within data sources for meeting business needs.
- Design and implement a semantic layer based on reporting requirements from the BI and business teams.
- Analyze data quality upstream and downstream to establish trust in data solutions.
- Work with external vendor teams to support BI initiative roll-out across multiple phases.
- Provide support for strategic decisions related to data and analytics, long-term BI investments, and the toolsets that will support these new initiatives.
- Participate in meetings as needed to perform duties and support business and technical development.
- Utilize skills in Apache Kafka, Apache Spark, Cloud Technologies, Database, EO/IR systems, Algorithm Implementation, Analytics, Apache Hadoop, API Development, AWS Technologies, and Azure Databricks.
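Much of the ETL/ELT work above revolves around change data capture (CDC). As a rough, hypothetical sketch of the concept only (the record shape, field names, and `apply_cdc` helper are illustrative and not tied to Databricks or any specific toolset), a batch of change records can be applied to a keyed target table like this:

```python
# Hypothetical sketch of applying CDC change records to a target table,
# modeled here as a dict keyed by primary key. All names are illustrative.

def apply_cdc(target, changes):
    """Apply a batch of change records (insert/update/delete) in order."""
    for change in changes:
        op = change["op"]
        key = change["key"]
        if op in ("insert", "update"):
            # Upsert: merge the new column values over any existing row.
            target[key] = {**target.get(key, {}), **change["data"]}
        elif op == "delete":
            target.pop(key, None)
        else:
            raise ValueError(f"unknown CDC operation: {op!r}")
    return target

table = {1: {"name": "Acme", "region": "OH"}}
batch = [
    {"op": "update", "key": 1, "data": {"region": "Columbus"}},
    {"op": "insert", "key": 2, "data": {"name": "Globex", "region": "NY"}},
    {"op": "delete", "key": 1, "data": None},
]
apply_cdc(table, batch)  # table now holds only key 2
```

In a Databricks pipeline the same upsert/delete semantics would typically be expressed declaratively (for example with a Delta Lake merge) rather than in application code; the sketch only shows the ordering-sensitive apply logic the role would design and document.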
Requirements
- Experience with Azure Databricks is necessary.
- Proficiency in cloud technologies is required.
- Strong understanding of database structures and management.
- Experience with EO/IR systems is a must.
- Ability to implement algorithms effectively.
- Strong analytical skills are needed.
- Knowledge of Apache Hadoop is mandatory.
- Experience in API development is required.
- Proficiency in AWS technologies is essential.