Req ID
BD-25-1076-OCDDE-OCD01-110646
Title
Data Architect (Snowflake)
Client
State of MA - Executive Office of Housing and Livable Communities
Duration
12+ Months
Location
Boston, MA - 4 days onsite per month
Description of Duties:
- Design, develop, and maintain complex Snowflake data warehouses, including data modeling, ETL processes, and data quality control.
- Ensure data integrity, quality, and security across all data warehousing activities.
- Develop and maintain data pipelines that integrate data from various internal and external sources.
- Build pipelines using tools for data transformation, testing, and deployment, and ensure they are scalable, reliable, and efficient.
- Write Python code to automate data processing, transformation, and loading tasks, including data ingestion, data quality control, and data visualization.
- Collaborate with data analysts and business stakeholders to understand data requirements and develop data solutions that meet business needs.
- Develop and maintain data visualizations and reports using Snowflake's built-in visualization tools or third-party tools like Tableau.
- Ensure data quality, integrity, and security by developing data validation rules, data cleansing processes, and data access controls.
- Optimize Snowflake data warehouse performance, scalability, and reliability, including monitoring and troubleshooting data issues.
- Develop and maintain technical documentation, including data dictionaries, data flow diagrams, and code comments.
- Participate in code reviews and provide feedback to ensure high-quality code and adherence to coding standards.
- Work closely with the agency’s data team and the overall state technology team to ensure alignment on data strategy and technology standards.
- Communicate complex data findings to non-technical stakeholders in a clear and concise manner.
- Provide technical guidance and mentorship to junior analysts and other team members.
Qualifications:
- 8+ years of experience in data engineering, including Snowflake data warehouse development and management.
- Strong understanding of data modeling and data warehousing concepts.
- Proficiency with SQL, including Snowflake’s SQL dialect and complex query writing for performance optimization.
- Experience with Snowflake features, including cloning, time travel, data sharing, and micro-partitioning.
- Strong understanding of Snowflake’s architecture, including virtual warehouses and the separation of storage and compute.
- Expertise in Snowflake Role-Based Access Control (RBAC), data masking, and encryption.
- Proficiency with data integration tools and ETL/ELT processes.
- Experience in Python or similar scripting languages for automation and data processing.
- Knowledge of data security and compliance standards as they relate to Snowflake.
- Experience with cloud platforms such as AWS and Azure.
- SnowPro Advanced certification is a plus.
- Knowledge of data governance frameworks and tools.
- Strong experience integrating Snowflake with Tableau, Salesforce, third-party APIs, etc.
- Strong analytical, problem-solving, and communication skills.
- Ability to work collaboratively in a cross-functional team environment.
- Excellent verbal and written communication skills with project teams and business teams.
- Team-oriented attitude and the proven ability to collaborate at all levels of an organization.
Desired Skills/Experience:
- SnowPro Advanced Architect or SnowPro Data Engineering certification is preferred.
- Experience building complex data pipelines in a multi-cloud environment.
- Experience implementing data masking and aggregation to reduce privacy risks.
- Experience integrating data from Salesforce solutions.
- Knowledge of CI/CD practices and tools.
- Familiarity with JIRA or other project management tools.
- Experience using GitHub.
- Experience working on projects in the affordable housing sector or other public benefit programs.