Job Description
- 6+ years of experience in designing data pipeline solutions for large enterprise data warehouse applications using Azure Data Factory and Databricks, covering data ingestion, data transformation/computation, orchestration, reporting, and data analytics
- Knowledge of Azure services such as ADF, Databricks, Event Hub, Delta Lake, and Data Lake
- Strong command of Python, PySpark, SQL, and Power BI
- Experience with GitHub, GitHub Actions, and Jira across the end-to-end data pipeline lifecycle
- Strong knowledge of CI/CD pipelines for automated deployments
- Experience with Terraform for infrastructure provisioning
- Strong knowledge of design and integration patterns
- Proficient in producing technical artifacts, e.g., application architecture and solution design documents
- Strong analytical and problem-solving skills
- Experience working with multi-vendor, multicultural, distributed offshore and onshore development teams in dynamic and complex environments.
- Experience in retail is desired but not mandatory.
- Must have excellent written and verbal communication skills.
Roles & Responsibilities:
- Understand the requirements and current-state architecture of the enterprise, and create a roadmap for future enhancements accordingly.
- Lead and mentor the team and resolve technical escalations.
- Proactively communicate escalated issues to feature teams and product managers.
- Create the software architecture document, high-level and low-level design documents, and non-functional requirements for the project.