Job Title: Data Engineer
Location: Illinois (consultant based in the CST time zone preferred)
Duration: 12+ months
Contract: W2
Experience Level: 3+ years
Responsibilities:
- Design and develop scalable data pipelines and workflows using tools such as Apache Airflow.
- Implement near real-time data processing solutions and data models.
- Utilize PySpark and Spark SQL for data transformation and manipulation.
- Work with Databricks for big data processing and analytics.
- Implement data integration solutions using Azure Synapse and Azure Data Factory.
- Implement and maintain code versioning and CI/CD pipelines for efficient data pipeline deployment.
- Collaborate with data scientists and analysts to understand data requirements and deliver appropriate solutions.
- Monitor and optimize data pipeline performance, troubleshoot issues, and ensure data quality and reliability.
Requirements:
- Bachelor’s degree in Computer Science, Engineering, or a related field; or equivalent practical experience.
- 3+ years of experience in a Data Engineering role.
- Proficiency with data workflow orchestrators such as Apache Airflow.
- Hands-on experience building and optimizing near real-time data pipelines and models.
- Strong programming skills in Python, with experience in PySpark and Spark SQL.
- Experience working with Databricks for big data processing.
- Familiarity with cloud-based data platforms like Azure Synapse and Azure Data Factory.
- Experience with code versioning tools (e.g., Git) and implementing CI/CD pipelines.
- Strong analytical and problem-solving skills, with the ability to quickly adapt to new technologies and environments.
- Effective communication skills and ability to work collaboratively in a team environment.
- Experience in Agile development methodology is a plus.
Note: This position is based in Illinois and requires availability during CST business hours. Candidates must be authorized to work in the United States without sponsorship.