Who We Are
For more than 20 years, we have been working with organizations large and small to solve business challenges through technology. We bring a unique combination of engineering and strategy to Make Data Work for our clients.
Our clients span industries from travel and leisure to publishing, retail, and banking. The common thread among them is a commitment to making data work, demonstrated by their investment in those efforts.
In our quest to solve data challenges for our clients, we work with large enterprise, cloud-based, and marketing technology suites. Our deep understanding of these solutions lets us help clients make the most of their investments and run a data-driven business efficiently.
Softcrylic now joins forces with Hexaware to Make Data Work in bigger ways!
Why Work at Softcrylic?
Softcrylic provides an engaging, team-focused, and rewarding work environment where people are excited about the work they do and passionate about delivering creative solutions to our clients.
We are looking to add a Senior Data Engineer (with Azure Databricks) to our team!
Job Summary:
We are seeking a highly experienced Senior Data Engineer with a minimum of 10 years in IT, specializing in data warehousing, ETL processes, and cloud technologies, particularly on the Azure platform. The ideal candidate will have extensive hands-on experience with Azure Data Factory, Databricks, PySpark/Python, and a deep understanding of data warehousing principles.
Key Responsibilities:
- Lead the design, development, and deployment of scalable data solutions on Azure.
- Collaborate with cross-functional teams to gather requirements and translate them into technical solutions.
- Design and implement robust ETL pipelines using Azure Data Factory (ADF) and Databricks, ensuring data quality and integrity.
- Develop and optimize PySpark/Python code for data processing tasks.
- Provide technical leadership and mentorship to junior team members, fostering a collaborative and knowledge-sharing environment.
- Create and maintain comprehensive documentation, including design specifications, source-to-target mapping (STTM) documents, and technical specifications.
- Ensure data solutions align with industry best practices and compliance requirements, particularly in the banking domain where applicable.
Primary Skills:
- Azure: Extensive hands-on experience with Azure Cloud services, including Azure Data Factory (ADF), Synapse Analytics, and related Azure components.
- Databricks: Proficiency in working with Databricks for data processing, transformation, and analytics.
- ADF (Azure Data Factory): Expertise in designing and implementing ETL pipelines using ADF for large-scale data processing.
- PySpark/Python: Advanced programming skills in PySpark and Python for data engineering tasks.
Secondary Skills:
- Data Warehousing: In-depth knowledge of data warehousing concepts, architecture, and implementation, including experience with various data warehouse platforms.
Must-Have Qualifications:
- 10+ Years of IT Experience: Proven track record of working in IT with a focus on data engineering, data warehousing, and ETL processes.
- Cloud Technologies: Extensive experience with cloud-based data solutions, particularly on the Azure platform, including ADF, Synapse, and Databricks.
- Design and Documentation: Ability to understand and interpret complex design requirements, source-to-target mapping (STTM), and create detailed technical specification documents.
- Client-Site Flexibility: Willingness and ability to operate from client office locations as required, ensuring seamless collaboration and project delivery.
- Mentorship: Capable of mentoring and guiding junior team members, providing technical support and knowledge transfer as needed.
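As a hypothetical illustration of the STTM-driven work described above, a source-to-target mapping can be expressed as data and applied in plain Python. All column names and transforms below are invented for the example and are not drawn from any actual client project:

```python
# Minimal sketch: apply a source-to-target mapping (STTM) to one record.
# Column names and transform rules here are hypothetical examples.

def apply_sttm(record, sttm):
    """Map a source record onto target columns per the STTM rules."""
    target = {}
    for rule in sttm:
        value = record.get(rule["source"])
        transform = rule.get("transform")
        # Apply the optional per-column transform, else pass the value through.
        target[rule["target"]] = transform(value) if transform else value
    return target

# Example STTM: rename columns and normalize values.
sttm = [
    {"source": "cust_id", "target": "customer_id"},
    {"source": "cust_nm", "target": "customer_name", "transform": str.strip},
    {"source": "txn_amt", "target": "transaction_amount", "transform": float},
]

source_row = {"cust_id": 101, "cust_nm": "  Acme Corp ", "txn_amt": "42.50"}
print(apply_sttm(source_row, sttm))
# {'customer_id': 101, 'customer_name': 'Acme Corp', 'transaction_amount': 42.5}
```

In practice the same mapping logic would typically be implemented over PySpark DataFrames in Databricks rather than plain dictionaries, but the STTM document drives the transformation in the same way.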