Key Responsibilities:
- Develop and maintain data processing applications using Scala (a minimal illustrative sketch follows this list).
- Collaborate with cross-functional teams to understand data requirements and design efficient solutions.
- Implement test-driven deployment practices to enhance application reliability, and promote artifacts from lower to higher environments to ensure smooth transitions.
- Troubleshoot and debug Spark performance issues to ensure optimal data processing.
- Work in an agile environment, contributing to sprint planning and development, and delivering high-quality solutions on time.
- Provide production support for batch jobs, triaging issues and delivering fixes to meet critical business needs.
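
By way of illustration, below is a minimal sketch of the kind of Scala Spark batch job this role involves. The paths, column names, and aggregation logic are hypothetical assumptions for the example, not an actual application owned by this team.

```scala
// Hypothetical sketch: a minimal Spark batch job in Scala.
// All paths, column names, and the aggregation are illustrative assumptions.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailySalesJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("DailySalesJob")
      .getOrCreate()

    // Read raw events (hypothetical input location and schema).
    val events = spark.read.parquet("s3://example-bucket/raw/sales/")

    // Aggregate completed sales into revenue per product per day.
    val daily = events
      .filter(col("status") === "COMPLETED")
      .groupBy(col("product_id"), to_date(col("event_ts")).as("sale_date"))
      .agg(sum("amount").as("revenue"))

    // Write results partitioned by date for downstream consumers.
    daily.write
      .mode("overwrite")
      .partitionBy("sale_date")
      .parquet("s3://example-bucket/curated/daily_sales/")

    spark.stop()
  }
}
```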
Skills/Competencies:
- Strong knowledge of Java and Scala.
- Excellent problem-solving and analytical skills.
- Proficiency in Spark, including the development and optimization of Spark applications.
- Ability to troubleshoot and debug performance issues in Spark (see the tuning sketch after this list).
- Understanding of design patterns and data structures for efficient data processing.
- Familiarity with database concepts and SQL.
- Experience with test-driven deployment practices (Good to have).
- Familiarity with Python (Good to have).
- Knowledge of Databricks (Good to have).
- Understanding of Snowflake for data storage and retrieval (Good to have).
- Understanding of DevOps practices (Good to have).
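
To illustrate the Spark optimization and troubleshooting skills listed above, the sketch below walks through a few common tuning steps: inspecting the physical plan, broadcasting a small dimension table, repartitioning by the join key, and caching a reused DataFrame. All dataset names, paths, and the partition count are hypothetical assumptions.

```scala
// Hypothetical sketch of common Spark performance-tuning steps;
// dataset names, paths, and the partition count are illustrative assumptions.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object TuningExamples {
  def tune(spark: SparkSession): Unit = {
    val orders    = spark.read.parquet("s3://example-bucket/orders/")
    val customers = spark.read.parquet("s3://example-bucket/customers/")

    // 1. Inspect the physical plan before guessing at fixes.
    orders.join(customers, "customer_id").explain(true)

    // 2. Broadcast a small dimension table to avoid a shuffle join.
    val joined = orders.join(broadcast(customers), "customer_id")

    // 3. Repartition by the join key to rebalance work before a wide aggregation.
    val balanced = joined.repartition(200, col("customer_id"))

    // 4. Cache a DataFrame reused by multiple downstream actions,
    //    then trigger an action to materialize the cache.
    balanced.cache()
    println(balanced.count())
  }
}
```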