Job Description
Donato Technologies, established in 2012, excels as a comprehensive IT service provider renowned for delivering an exceptional staffing experience and prioritizing the needs of both clients and employees. We specialize in staffing, consulting, software development, and training, catering to small and medium-sized enterprises. While our core strength lies in Information Technology, we also deeply understand and address the unique business requirements of our clients, leveraging IT to effectively meet those needs. Our commitment is to provide high-quality, customized solutions using the optimal combination of technologies.
Job Summary
- Experience building and implementing data pipelines using Databricks or a similar cloud data platform.
- Experience with Delta Live Tables (DLT).
- Expert-level knowledge of SQL, including writing complex, highly optimized queries across large volumes of data.
- Hands-on object-oriented programming experience in Python is required.
- Professional experience building real-time data streams using Apache Spark.
- Knowledge of or experience with architectural best practices for building data lakes.
- Experience with two or more scripting languages, such as Python and Bash.
- A solid understanding of containerization and orchestration technologies such as Docker, Kubernetes, or ECS.
- Understanding of the infrastructure-as-code paradigm, specifically provisioning AWS infrastructure using Terraform and/or CloudFormation.
Skills
PRIMARY COMPETENCY: Big Data Technologies
PRIMARY SKILL: Databricks
Please send your resume to resumes@donatotech.net