Python Engineer

GalaxE.Solutions
Job Summary

Location: Toronto, ON C6A
Job Type: Contract
Visa: Any Valid Visa
Salary: PayRate
Qualification: BCA
Experience: 2-10 Years
Posted: 08 Jan 2025
Job Description

What you will do:

  • Build a Critical Foundation: Lead the development of a critical data quality framework that will impact how capital markets data is managed across the organization.
  • Autonomy and Ownership: As the sole architect for the first several months, you will have significant responsibility and the freedom to shape the initiative's direction.
  • Growth Opportunities: As the project expands, there will be opportunities to grow the team, scale the initiative, and take on a leadership role in the longer-term data contracts strategy.
  • Collaborate with Leadership: Work directly with our team of technical leaders with deep expertise, in a highly collaborative environment.
  • Design and Implement Data Quality Frameworks: Build a proactive (and reactive), scalable data quality framework that ensures all critical capital markets data flows meet high standards of accuracy, completeness, and timeliness.
  • Collaborate with Data Engineers and Business Teams: Work with data engineering and business teams to identify key data quality issues, establish validation rules, and implement solutions that prevent data quality problems before they arise.
  • Develop Monitoring and Automation Tools: Implement automated data quality checks and monitoring systems that flag and address data issues in real time. Leverage tools like Great Expectations or similar frameworks to validate data integrity across pipelines (a minimal sketch follows this list).
  • Define Data Quality Metrics and SLAs: Establish clear metrics for measuring data quality (e.g., accuracy, completeness, consistency) and set up service level agreements (SLAs) that data producers and consumers can use to track performance.
  • Pilot Initial Data Flows: Lead a pilot focused on a few key data streams (such as risk analytics and trade execution data) to refine the framework, demonstrate value, and gain early insights into how proactive data quality can support downstream data use cases.
  • Drive Data Governance: Build the governance structures that will ensure long-term data quality, including clear data ownership and stewardship responsibilities, processes for addressing data quality issues, and reporting tools for business and technical teams.
  • Support Long-Term Vision for Data Contracts: While the primary focus is on proactive data quality, this work will lay the foundation for establishing formal data contracts in the future, where data quality metrics and SLAs become formalized agreements between teams.
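
To make the monitoring and SLA bullets above concrete, here is a minimal, framework-agnostic sketch in Python/pandas of the kind of proactive checks and pass-rate metric described. The column names, the specific checks, and the 99.5% threshold are illustrative assumptions, not details of the role; in practice a library such as Great Expectations would supply the validation layer.

import pandas as pd

# Illustrative sketch only: hand-rolled checks standing in for a framework
# like Great Expectations. Column names and the SLA threshold are assumptions.
COMPLETENESS_SLA = 0.995  # assumed target: 99.5% of rows pass every check

def run_checks(trades: pd.DataFrame) -> dict:
    """Run row-level quality checks and compare the pass rate to the SLA."""
    checks = {
        "price_positive": trades["price"] > 0,           # accuracy
        "trade_id_present": trades["trade_id"].notna(),  # completeness
        "executed_in_past":                              # timeliness
            pd.to_datetime(trades["executed_at"]) <= pd.Timestamp.now(),
    }
    passed = pd.concat(checks, axis=1).all(axis=1)  # per-row pass/fail
    pass_rate = passed.mean()
    return {
        "rows": len(trades),
        "pass_rate": pass_rate,
        "sla_met": pass_rate >= COMPLETENESS_SLA,
        "failed_rows": trades.index[~passed].tolist(),  # flag for remediation
    }

if __name__ == "__main__":
    sample = pd.DataFrame({
        "trade_id": ["T1", "T2", None],
        "price": [101.5, -3.0, 99.9],
        "executed_at": ["2025-01-07", "2025-01-06", "2025-01-08"],
    })
    print(run_checks(sample))

Each check yields a per-row boolean; combining them gives a per-row pass/fail, and the overall pass rate becomes the metric that data producers and consumers track against the agreed SLA.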

Required:

  • 7+ years of experience in data quality, data architecture, or a similar role.
  • Proven experience designing and implementing data quality frameworks that ensure accuracy, completeness, and timeliness of data.
  • Strong skills and experience building full-stack applications in Python.
  • Understanding of cloud infrastructure (GCP or Azure) and how to implement scalable data quality systems within cloud-native architectures.
  • Ability to work closely with data engineers to support their data pipeline development by embedding data quality checks and standards.
  • Experience establishing and tracking data quality metrics and SLAs and working with teams to ensure compliance.
  • Proficiency in Python as a data engineering language, to support creating and automating data quality check patterns across multiple clients.

Desired:

  • Familiarity with data governance frameworks and how data quality ties into governance processes.
  • Expertise in data validation tools such as Great Expectations, whylogs, dbt expectations, Apache Griffin, or custom-built solutions for real-time data quality checks.
  • Experience working in environments that emphasize data accountability and clear data ownership across multiple teams and systems.
  • Hands-on experience in capital markets, financial services, or similar high-performance data environments.
  • Familiarity with capital markets data structures (e.g., trades, pricing, risk data) and the unique challenges around managing and ensuring quality in high-frequency, transactional data environments.
  • Database experience: Dremio / SQL / NoSQL.
