Job Responsibilities:
- Partner with Data Scientists and Data Engineers to operationalize machine learning and optimization models that deliver new insights to the business
- Build data APIs and data delivery services that support critical operational and analytical applications for internal business operations, customers, and partners
- Take responsibility for ensuring that model code, data pipelines, APIs, and user interfaces are developed and deployed successfully into production
- Troubleshoot issues as they arise and support production applications
- Continuously integrate code into on-premises and AWS cloud environments
- Transform data science prototypes into "production-ready" software products
- Ensure reliable data flow between databases and backend systems
- Optimize solutions for performance and scalability
- Ensure that established methodologies, standards, and procedures are followed
- Ensure that solutions meet the customers' business goals and that customer satisfaction remains high throughout the project and at its conclusion.
Required Skills & Experience:
- 5-7 years of hands-on experience designing, developing, integrating, and running business, big data, and/or data science applications
- 1-2+ years of hands-on experience working with AWS as a developer
- Experience developing Angular applications
- Experience developing with Java, SQL and building REST APIs
- Experience working with AWS services, with an emphasis on managed services such as Lambda, DynamoDB, SQS, EventBridge, Step Functions, Aurora, S3, and API Gateway
- Experience developing infrastructure-as-code using AWS tools (e.g., Cloud Development Kit, CloudFormation)
- Experience with automating application deployment, continuous delivery, and continuous integration (Git, GitLab, Jenkins)
- Experience using Agile/Scrum methodologies
- Experience with backlog management tools (e.g., VersionOne, JIRA).
- Strong problem-solving skills and capability to understand and set direction for complex technology integration