Snowflake SME

Resource Informatics Group

Job Summary
Location
Baltimore, MD
Job Type
Contract
Visa
Any Valid Visa
Salary
Market rate (all inclusive)
Qualification
BCA
Experience
2 - 10 Years
Posted
25 Dec 2024
Job Description

Role: Snowflake SME
Location: Baltimore, MD (initially remote)
Duration: 6+ months
Rate: Market rate, all inclusive
Skills (all skill sets require at least 10 years of experience):
• Snowflake SME with strong, hands-on experience across the Snowflake space
• ETL and Informatica experience is nice to have
• Knowledge of SQL language and cloud-based technologies (SQL - 10 years)
• Data warehousing concepts, data modeling, metadata management
• Data lakes, multi-dimensional models, data dictionaries
• Performance tuning and setting up resource monitors
• Snowflake modeling - roles, databases, schemas (see the sketch after this list)
• SQL performance measuring, query tuning, and database tuning
• ETL tools with cloud-driven skills
• Ability to build analytical solutions and models
• Root cause analysis of models with solutions
• Hadoop, Spark, and other warehousing tools
• Managing sets of XML, JSON, and CSV from disparate sources
• SQL-based databases such as Oracle, SQL Server, Teradata, etc.
• Snowflake warehousing, architecture, processing, administration
• Data ingestion into Snowflake
• Enterprise-level technical exposure to Snowflake applications
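For illustration only, the sketch below shows the kind of role, schema, and resource-monitor setup referenced in this list, using the snowflake-connector-python package. The account, credentials, and object names (analyst_role, analytics_db, reporting_wh, monthly_quota) are hypothetical placeholders, not details from this posting.

    # Illustrative sketch only: create a role, database, and schema, then attach
    # a resource monitor to a warehouse. All names and credentials are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",    # hypothetical account identifier
        user="admin_user",       # hypothetical user
        password="***",          # supply via a secrets manager in practice
        role="ACCOUNTADMIN",     # creating resource monitors requires ACCOUNTADMIN by default
    )

    statements = [
        "CREATE ROLE IF NOT EXISTS analyst_role",
        "CREATE DATABASE IF NOT EXISTS analytics_db",
        "CREATE SCHEMA IF NOT EXISTS analytics_db.reporting",
        "GRANT USAGE ON DATABASE analytics_db TO ROLE analyst_role",
        "GRANT USAGE ON SCHEMA analytics_db.reporting TO ROLE analyst_role",
        # Suspend the warehouse once 100% of a monthly credit quota is consumed
        "CREATE OR REPLACE RESOURCE MONITOR monthly_quota "
        "WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY "
        "TRIGGERS ON 100 PERCENT DO SUSPEND",
        # Assumes the warehouse reporting_wh already exists
        "ALTER WAREHOUSE reporting_wh SET RESOURCE_MONITOR = monthly_quota",
    ]

    cur = conn.cursor()
    try:
        for stmt in statements:
            cur.execute(stmt)
    finally:
        cur.close()
        conn.close()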
Responsibilities:
• Create, test, and implement enterprise-level apps with Snowflake
• Design and implement features for identity and access management
• Create authorization frameworks for better access control
• Implement client query optimization and core security controls, including encryption
• Solve performance issues and scalability issues in the system
• Transaction management with distributed data processing algorithms
• Take ownership of deliverables from start to finish
• Build, monitor, and optimize ETL and ELT processes with data models (see the loading sketch after this list)
• Migrate solutions from on-premises setup to cloud-based platforms
• Understand and implement the latest delivery approaches based on data architecture
• Project documentation and tracking based on understanding user requirements
• Perform data integration with third-party tools including architecting, designing, coding, and testing phases
• Manage documentation of data models, architecture, and maintenance processes
• Continually review and audit data models for enhancement
• Maintain reliable data pipelines built on ETL tools
• Coordination with BI experts and analysts for customized data models and integration
• Code updates, new code development, and reverse engineering
• Performance tuning, user acceptance training, application support
• Maintain confidentiality of data
• Risk assessment, management, and mitigation plans
• Regular engagement with teams for status reporting and routine activities
• Migration activities from one database to another or on-premises to cloud
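As a rough illustration of the data-ingestion and ETL/ELT responsibilities above, the sketch below bulk-loads staged CSV files into a Snowflake table with COPY INTO via snowflake-connector-python. The warehouse, database, stage, table, and file-format names (etl_wh, analytics_db, orders_stage, raw_orders, csv_fmt) are hypothetical placeholders.

    # Illustrative sketch only: bulk-load staged CSV files into a raw table.
    # All connection details and object names are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",      # hypothetical account identifier
        user="etl_user",           # hypothetical service user
        password="***",            # supply via a secrets manager in practice
        warehouse="etl_wh",
        database="analytics_db",
        schema="staging",
    )

    cur = conn.cursor()
    try:
        # File format describing the incoming CSV files
        cur.execute(
            "CREATE FILE FORMAT IF NOT EXISTS csv_fmt "
            "TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1"
        )
        # Target table for the raw load
        cur.execute(
            "CREATE TABLE IF NOT EXISTS raw_orders ("
            "order_id NUMBER, customer_id NUMBER, amount NUMBER(12,2), order_ts TIMESTAMP_NTZ)"
        )
        # Bulk load from a previously created stage (e.g. an external S3 stage)
        cur.execute(
            "COPY INTO raw_orders FROM @orders_stage "
            "FILE_FORMAT = (FORMAT_NAME = 'csv_fmt') ON_ERROR = 'CONTINUE'"
        )
    finally:
        cur.close()
        conn.close()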
