Job Title: Snowflake Architect
Location: Houston, TX
Role Type: Contract
Contract Length: 12 Months
How to Apply: Please send your resume and contact information to recruiters@4aitservices.com
Job Summary:
We are seeking a highly skilled Snowflake Architect to lead the design, development, and implementation of Snowflake solutions across the organization. The ideal candidate will architect and oversee the implementation of Snowflake data warehouses, optimize performance, ensure data security, and lead data migration efforts. The Snowflake Architect will collaborate closely with data engineers, analysts, and business stakeholders to ensure data strategies align with business goals.
Key Responsibilities:
- Architect and design Snowflake-based data warehouses, including schema design, table design, and data partitioning strategies.
- Lead and execute data migration projects from legacy systems to Snowflake.
- Develop, optimize, and maintain ETL/ELT pipelines to ensure seamless data integration and flow.
- Ensure high performance and scalability of Snowflake solutions by applying appropriate clustering keys, search optimization, and micro-partition pruning strategies.
- Oversee data security, access control, and compliance efforts, implementing robust governance policies.
- Collaborate with data engineers, analysts, and business stakeholders to design and implement data solutions that meet business requirements.
- Establish best practices for data ingestion, storage, and query optimization in Snowflake.
- Design and implement data pipelines using tools such as AWS Glue, Azure Data Factory, or Informatica to load data into Snowflake.
- Provide technical leadership, mentoring, and knowledge transfer to the team.
Requirements:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 8+ years of experience in data architecture, database design, and data warehousing.
- 3+ years of experience working with Snowflake and cloud data platforms.
- Strong knowledge of Snowflake architecture, features, and best practices.
- Hands-on experience with ETL/ELT processes, including tools like Apache Airflow, AWS Glue, Azure Data Factory, or similar.
- Proficient in SQL and working knowledge of Python, Scala, or other programming languages used in data engineering.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud, with a focus on Snowflake integration.
- Solid understanding of data security, governance, and compliance frameworks (e.g., GDPR, HIPAA).
- Experience with performance tuning in Snowflake, including clustering keys, micro-partitioning, and materialized views.
- Familiarity with modern BI tools like Tableau, Power BI, or Looker.
- Excellent problem-solving skills and ability to work in a collaborative team environment.
- Strong verbal and written communication skills.
Preferred Qualifications:
- Snowflake certification (e.g., SnowPro Core, SnowPro Advanced Architect).
- Experience in big data technologies such as Spark, Hadoop, or Databricks.
- Knowledge of data modeling methodologies such as Kimball or Inmon.
- Prior experience building solutions with dbt or similar data transformation tools.