Data Developer

Autodesk, Inc.
Job Summary
Location: , ON
Job Type: Contract
Visa: Any Valid Visa
Salary: PayRate
Qualification: BCA
Experience: 2 - 10 Years
Posted: 16 Nov 2024
Job Description
Conversion US P2 Data Developer
Job Requisition ID # 24WD82787
Data Developer @ Autodesk
As a global leader in 3D design, engineering, and entertainment software, Autodesk helps people imagine, design, and create a better world. Autodesk accelerates better design through an unparalleled depth of experience and a broad portfolio of software, giving customers the power to solve their design, business, and environmental challenges. In addition to designers, architects, engineers, and media and entertainment professionals, Autodesk helps students, educators, and casual creators unlock their creative ideas through user-friendly applications.
The Data Science and Machine Learning (DXC-DS) team is responsible for delivering Machine Learning capabilities that transform the digital customer journey across the Sales, Marketing, and Customer Success functions at Autodesk.
We aim to help our customers through conversation, personalization, and other machine learning-empowered features. Our team members work in the pre-modeling and post-modeling stages with a specific focus on Data Engineering and Machine Learning Operationalization. Our team collectively builds and maintains production Machine Learning systems driving rich customer interactions and better business outcomes. We work with modern technology in a fast-paced, agile environment, in a supportive and collaborative way, across locations and time zones.
Position Overview
As a Data Engineer, you will be responsible for designing, building, maintaining, monitoring, integrating, and testing scalable data pipelines in production. You will partner with multi-disciplinary teams such as Machine Learning Engineering and Platform Engineering, as well as Data Analysts, to coordinate delivery of pipelines for specific business needs. Your work will contribute to strategic initiatives such as optimization of digital conversion metrics and development of Autodesk Assistant, an LLM-driven chatbot intended to answer customer inquiries.
Our team culture is built on collaboration, mutual support, and continuous learning. We emphasize an agile, hands-on, and technical approach at all levels of the team. As a group, we want to continuously improve our work as well as our knowledge of trends and techniques relevant to our areas. We encourage personal development and knowledge sharing.
Responsibilities
Data Pipeline Construction: Design, build, and maintain scalable data pipelines to support continuous data flow and analytics
Data Modeling: Develop and implement effective data models and schemas to ensure data integrity and accessibility
Database Management: Oversee and optimize database systems (SQL, NoSQL) ensuring high performance and availability
Data Integration: Integrate data from various sources, ensuring consistency, quality, and reliability
ETL Processes: Design and manage Extract, Transform, Load (ETL) processes to move data between systems (a minimal pipeline sketch follows this list)
Data Governance: Implement and enforce data governance policies to ensure data security, privacy, and compliance
Collaboration: Work closely with data scientists, analysts, and stakeholders to understand data requirements and deliver solutions
Performance Tuning: Enhance and tune database performance for efficient processing and querying of large datasets
Documentation: Maintain comprehensive documentation for data processes, models, and architecture
Platform Mindset: Partner with the internal platform team
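To make the pipeline and ETL responsibilities above concrete, here is a minimal, illustrative extract-transform-load sketch in Python. The file names, table, and columns (source_csv, customers, email, country) are hypothetical placeholders rather than Autodesk systems; a production pipeline would typically run under an orchestrator and write to a warehouse instead of SQLite.

```python
import csv
import sqlite3
from pathlib import Path


def run_etl(source_csv: Path, db_path: Path) -> int:
    """Extract rows from a CSV file, clean them, and load them into SQLite."""
    # Extract: read raw records from the (hypothetical) source file.
    with source_csv.open(newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: normalize values and drop records with no email.
    cleaned = [
        {"email": r["email"].strip().lower(), "country": r.get("country", "").upper()}
        for r in rows
        if r.get("email")
    ]

    # Load: upsert the cleaned records into the target table.
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS customers (email TEXT PRIMARY KEY, country TEXT)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO customers (email, country) VALUES (:email, :country)",
            cleaned,
        )
    return len(cleaned)
```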
Minimum Qualifications
Bachelor's degree in Computer Science, Information Technology, or a related field. A master's degree is a plus.
Proficiency in programming languages such as Python, Java, or Scala
Strong knowledge of SQL and experience with relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra)
Knowledge of big data technologies and frameworks (e.g., Kafka, Flink, Parquet, Iceberg, etc.)
ETL Tools: Skilled in using ETL tools such as Apache Airflow (see the Airflow sketch after this list)
Experience with cloud services from AWS, Azure, or Google Cloud, including data-related services like AWS Glue, EMR, Redshift, and S3
Experience with data warehousing solutions like Snowflake, Redshift, or BigQuery
Understanding of data modeling, data architecture, and ETL processes
Experience with version control systems (e.g., Git) and continuous integration/continuous deployment pipelines
Relevant certifications in big data technologies, cloud platforms, or database management are a plus
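Since Apache Airflow is called out above, a minimal TaskFlow-style DAG is sketched below for orientation, assuming Airflow 2.4+; the DAG id, schedule, and task bodies are illustrative assumptions, not part of the role description.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_customer_etl():
    @task
    def extract() -> list[dict]:
        # Hypothetical source; a real pipeline would query an API or database.
        return [{"user_id": 1, "plan": "pro"}, {"user_id": 2, "plan": "free"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Keep only paying customers (illustrative business rule).
        return [r for r in rows if r["plan"] != "free"]

    @task
    def load(rows: list[dict]) -> None:
        # Hypothetical sink; a real pipeline would write to a warehouse table.
        print(f"loading {len(rows)} rows")

    load(transform(extract()))


example_customer_etl()
```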
Preferred Qualifications
Experience in customer journey analytics and personalization
Familiarity with machine learning concepts and collaboration with data science teams
Knowledge of real-time data processing and streaming architectures
Experience with containerization tools like Docker and orchestration platforms like Kubernetes