USC only
2 Zoom calls
Remote
This is a 6-month contract-to-hire/perm position. Also, these guys are very technical and TOUGH INTERVIEWERS, so if you have any concerns about a candidate's communication skills or technical background, I'd rather you not send them. They will get swallowed up by these guys…I can't emphasize "tough" enough.
Required Skills (Please Provide These Details for Submission)
- Python <<< Years of experience:
- Azure DF and Data Lakes experience <<< Years of experience:
- SQL <<< Years of experience:
- Minimum of five years of experience in data integration and ETL processes required.
- Strong experience with Azure Data Factory and Azure services, such as Azure Blob Storage, Azure SQL Database, and Azure Data Lake Storage, required.
- Bachelor's degree in Computer Science, Engineering, or a related field required. <<< Please make sure this is on the resume
- Experience with SQL and programming languages, such as Python and PowerShell, required. <<< Which ones? Years of experience:
Desired Skills (Not Required)
They are tough interviewers. Candidates MUST go under the hood and really show what they know with Data Factory — no simple one-word answers.
- Power BI <<< Years of experience:
Day To Day Responsibilities
The Data Engineer is responsible for modeling the three Company values of Compassion, Integrity, and Excellence, and for promoting the Compassus philosophy, using the 6 Pillars of Success as the foundation. S/he is responsible for upholding the Code of Ethical Conduct and for promoting positive working relationships within the company, among all departments, and with all external stakeholders. The Data Engineer will be responsible for designing, developing, and maintaining data pipelines in Azure Data Factory to support the organization's data structures, such as the Enterprise Data Warehouse, Data Marts, and semantic layer data. S/he will utilize a strong background in data design, development, integration, testing, and ETL processes, as well as expertise with Azure Data Factory.

Position Specific Responsibilities
- Design and develop data pipelines using Azure Data Factory to support data integration and ETL processes.
- Work with business stakeholders to understand data requirements and translate them into technical specifications.
- Develop and maintain data processing and transformation logic using SQL in Azure Data Factory.
- Optimize data pipelines for performance, scalability, and reliability.
- Troubleshoot issues with data pipelines and provide timely resolutions.
- Implement data security and access controls to protect sensitive data.
- Collaborate with cross-functional teams to develop and integrate data from various sources and systems.
- Document data pipeline design, development, and maintenance processes.
- Perform other duties as assigned.
- Familiarity with data warehousing and BI tools, such as Power BI and Azure Analysis Services, preferred.
- Familiarity with additional Azure services is a plus.