Skills: 7+ years. Thought leadership. Java or Python. UI. Data platform or data analytics web apps. SQL. AWS. Bonus: ML, big data, Airflow, Domino
Key skills: Looks like a Senior Java or Python Developer with:
- Someone seasoned who shows thought leadership (a formal leadership role is not required) and can handle their own design, planning, and development.
- 7+ years of experience, weighted toward the back end (either Python or Java)
- Data platform or data analytics web applications
- UI (any framework; a lighter, secondary part of the role)
- SQL
- AWS
Bonus skills: data pipelines integrated with ML, Domino, Apache Airflow, big data
JD: This team is building an experimentation platform for batch data pipelines used to find examples of non-compliant or manipulative trading behaviors. You will work on a cross-functional team of product designers, analysts, and data engineers to design and implement solutions that enable downstream users to experiment with data pipelines in a no/low-code manner.
Job Responsibilities:
• Interpret business needs and design a web application that supports them
• Act as the technical lead for both design and implementation, leveraging industry standards and best practices
• Develop performant and modular back-end systems to interact with existing Data Lakes and Data Platforms
• Develop responsive front-end systems for use by internal business users
• Shepherd releases through the SDLC process to production
Qualifications:
• 7+ years of experience building production-quality, multi-tier web applications
• 4+ years of experience building RESTful web services in Java (J2EE) or Python (Django/Flask)
• 2+ years of experience building web front-ends with JavaScript UI frameworks such as jQuery, Angular, Node, or similar
• Experience building data platform or data analytics web applications
• Experience with SQL
• 4+ years of experience building on AWS services
• Experience working in agile scrum methodologies using Continuous Integration/Continuous Delivery (CI/CD) development practices
• Experience writing automated unit, component, and integration tests
• Able to adapt to evolving business priorities and requirements
Nice to Have:
• Experience with data pipelines integrated with Machine Learning models
• Hands on experience with data pipelines and big data technologies such as Spark, Kafka, Flink, or similar
• Familiarity with Apache Airflow or similar
• Familiarity with Domino or similar
• Experience with serverless big data stacks