Primary: Databricks, Python, Qlik, SQL.
Good to have: Informatica.
• Bachelor's degree in IT/Systems or a related field, or equivalent experience.
• 3+ years of relevant experience in data engineering.
• Demonstrated experience with core AWS services such as EC2, EMR, S3, IAM policies, CloudWatch, CloudFormation, and SES.
• Demonstrated experience with cloud services for data handling and database technologies (Lambda, DMS, Kafka, Spark, Redshift, Hadoop, Airflow, etc.).
• Demonstrated experience with modern programming languages such as Python, Scala, or Java.
• Experience designing and building systems that process large volumes of heterogeneous data, both real-time and batch, delivering complex data solutions that leverage the elastic capabilities of cloud services to maximize performance while minimizing waste.
• Command of advanced SQL queries and programming.
• Experience contributing to and following architecture, design and implementation best practices.
• Proven analytical and problem-solving abilities: the ability to assimilate information, quickly discern the most relevant facts, and recommend creative, practical design solutions. The ability to think outside the box is a real asset.
• Experience with DevOps tools, processes, and CI/CD is an asset.
• Demonstrated ability to deliver using Agile/Scrum methodology.
• Excellent communication, presentation, influencing, and reasoning capabilities.
• Desire to take the initiative, moving projects and ideas forward with clarity.
• Leadership skills to lead and mentor cross-functional teams towards common solutions.
• Knowledge of legacy data warehousing tools and technologies is an asset. Examples: dimensional models, Informatica PowerCenter, Informatica Cloud, Microsoft SQL Server Integration Services, Alteryx, Oracle, MS SQL Server, etc.