Hiring a Snowflake Solutions Architect
Role Description
As a Snowflake Solutions Architect, you will be responsible for designing, implementing, and maintaining data solutions on the Snowflake data platform. Your primary focus will be developing scalable data pipelines, optimizing data workflows, and ensuring data quality and reliability. You will collaborate closely with data scientists, analysts, and other stakeholders to understand business requirements and translate them into technical solutions. You will also play a key role in performance tuning, troubleshooting, and optimizing Snowflake databases for efficiency and cost-effectiveness.
As a Solutions Architect, You Will
- Design and implement scalable, efficient data pipelines using Snowflake features and functionality.
- Develop and maintain ETL processes to ingest, transform, and load data from various sources into Snowflake (a minimal sketch of this kind of load step follows this list).
- Collaborate with data architects and analysts to design and implement data models and schemas optimized for Snowflake.
- Monitor and optimize the performance of Snowflake databases to ensure reliability, scalability, and cost-effectiveness.
- Troubleshoot and resolve issues related to data ingestion, transformation, and query performance.
- Implement and enforce data governance and security policies within Snowflake.
- Stay up to date with the latest Snowflake features and best practices, and incorporate them into data engineering processes.
- Document data pipelines, workflows, and technical solutions for knowledge sharing and future reference.
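To make the ingestion responsibility concrete, here is a minimal, illustrative sketch (not part of the formal requirements) of a single load step using the snowflake-connector-python package. The table, stage, warehouse, and connection details are placeholders, not references to any existing environment.

```python
# Illustrative sketch only: one ingest step of a Snowflake ETL pipeline using
# the snowflake-connector-python package. Every name below (stage, table,
# warehouse, credentials) is a placeholder.
import os
import snowflake.connector

def load_raw_orders() -> None:
    """Copy newly staged files into a raw table and print the per-file load status."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",      # placeholder warehouse
        database="ANALYTICS",     # placeholder database
        schema="RAW",             # placeholder schema
    )
    try:
        cur = conn.cursor()
        # COPY INTO pulls any new files from the stage into the target table;
        # files already loaded are skipped automatically by Snowflake.
        cur.execute(
            """
            COPY INTO raw_orders
            FROM @orders_stage
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
            ON_ERROR = 'ABORT_STATEMENT'
            """
        )
        for row in cur.fetchall():
            print(row)  # one status row per file processed
    finally:
        conn.close()

if __name__ == "__main__":
    load_raw_orders()
```

In practice such a step would be scheduled by an orchestrator or a Snowflake task rather than run by hand; the sketch is only meant to show the scale of work each bullet above implies.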
Our Ideal Solutions Architect Will Have
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Snowflake certification (mandatory); certifications in related technologies are a plus.
- 5 years of experience as a Data Engineer or in a similar role, with a focus on building and optimizing data pipelines and workflows.
- Hands-on experience with the Snowflake data platform, including data ingestion, transformation, and querying.
- Proficiency in SQL and scripting languages like Python for data manipulation and automation (see the sketch after this list).
- Strong understanding of data modelling concepts and experience in designing and implementing data models for analytics and reporting.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
- Experience with big data technologies such as Hadoop, Spark, or Kafka.
- Knowledge of data warehousing concepts and best practices.
- Familiarity with DevOps practices and tools for continuous integration and deployment.
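As an illustration of the SQL and Python proficiency referenced above (not an actual deliverable of the role), the sketch below pulls the ten slowest queries of the past week from Snowflake's ACCOUNT_USAGE.QUERY_HISTORY view, the kind of automation used when right-sizing warehouses for cost-effectiveness. The connection parameters and warehouse name are placeholders.

```python
# Illustrative sketch only: report the slowest recent queries so warehouses can
# be right-sized. Connection parameters and the warehouse name are placeholders.
import os
import snowflake.connector

SLOW_QUERY_SQL = """
    SELECT query_id,
           warehouse_name,
           total_elapsed_time / 1000 AS elapsed_seconds,
           query_text
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
"""

def report_slow_queries() -> None:
    """Print the ten longest-running queries of the last seven days."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ADMIN_WH",  # placeholder warehouse
    )
    try:
        cur = conn.cursor()
        cur.execute(SLOW_QUERY_SQL)
        for query_id, warehouse, seconds, text in cur.fetchall():
            print(f"{seconds:8.1f}s  {warehouse or '-':<12}  {query_id}  {(text or '')[:60]}")
    finally:
        conn.close()

if __name__ == "__main__":
    report_slow_queries()
```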
Skills: solution architecture, Snowflake, cloud, Azure, Google Cloud, Hadoop, Spark