Snowflake DBT Architect

Company: Tredence Inc.

Job Summary
Location: Seattle, WA 98127
Job Type: Contract
Visa: Any Valid Visa
Salary: Pay Rate
Qualification: BCA
Experience: 2-10 Years
Posted: 23 Jan 2025
Job Description

Job Title: Snowflake DBT Architect

Location: Seattle

Employment Type: Full-Time

Experience Level: 10+ years

Job Summary:

We are seeking a highly skilled Snowflake DBT Architect to design and implement robust, scalable, and efficient data solutions. The ideal candidate will have extensive experience in Snowflake Data Warehouse, DBT (Data Build Tool), and cloud-based data architecture. This role involves collaboration with cross-functional teams to define data strategies, design workflows, and ensure optimal performance of data pipelines.


Key Responsibilities:

  1. Architect and Design Data Solutions
  • Develop and maintain scalable Snowflake architectures tailored to business requirements.
  • Design end-to-end data pipelines using DBT for data transformation and modeling.
  2. Data Integration and Transformation
  • Implement ETL/ELT processes to integrate structured and unstructured data from various sources into Snowflake.
  • Use DBT to create and optimize reusable data models and transformations.
  3. Performance Optimization
  • Monitor and fine-tune the performance of the Snowflake environment.
  • Optimize DBT workflows for speed and reliability.
  4. Collaboration and Stakeholder Management
  • Work with business analysts, data scientists, and other stakeholders to understand data requirements.
  • Partner with DevOps and cloud engineering teams to ensure seamless integration.
  5. Governance and Best Practices
  • Establish data governance, security, and quality control measures.
  • Document architecture, workflows, and best practices for data solutions.


Required Skills and Qualifications:

  • Snowflake Expertise: In-depth understanding of Snowflake architecture, performance tuning, and cost optimization strategies.
  • DBT Experience: Advanced knowledge of DBT, including model design, testing, and deployment.
  • Data Engineering: Hands-on experience with SQL, Python, or similar languages for data processing.
  • Cloud Platforms: Proficiency in AWS, Azure, or Google Cloud, with expertise in deploying Snowflake solutions in a cloud environment.
  • ETL/ELT Tools: Familiarity with tools such as Apache Airflow, Informatica, or similar platforms.
  • Data Modeling: Strong knowledge of dimensional modeling, star/snowflake schemas, and modern data warehousing practices.
  • Version Control: Experience with Git for source control and versioning DBT models.


Preferred Qualifications:

  • Snowflake and DBT certifications.
  • Knowledge of CI/CD pipelines for data workflows.
  • Experience with BI tools like Tableau, Power BI, or Looker.
  • Understanding of DataOps methodologies.


Soft Skills:

  • Strong problem-solving and analytical thinking.
  • Excellent communication and collaboration skills.
  • Ability to work in fast-paced, dynamic environments.


Why Join Us?

  • Work on cutting-edge data technologies.
  • Collaborate with talented professionals in a dynamic team.
  • Opportunities for growth and learning in data architecture.
