Zaddy Solutions is searching for Java-based Software Developers at several levels for a large enterprise digital transformation initiative with a client of ours in the Capital Markets sector. These roles are long-term contract-to-hire, hybrid, and can be located in CLT, NYC, or ATL.
Initiative: Foundational Data Hub Development
Join a dynamic team tasked with designing and implementing a cutting-edge foundational data hub to support near real-time risk reporting, analytics, and business reporting. This data hub will integrate CIB (Corporate and Investment Banking) systems of record and Reference Data, enabling Regulatory Trade Reporting and Risk Reporting across various business units, including:
- Derivatives (interest rates, credit, commodities)
- Foreign Exchange
- Equities
- Fixed Income
- Loans
- Investment Banking / Capital Markets
Role: Senior Developer and Development Lead
We are looking for motivated and skilled Software Engineers with expertise in Java, Kafka, Apache Flink, and MQ technologies to design, develop, and maintain a state-of-the-art data hub. This system will facilitate event-driven communication and real-time data streaming, ensuring high performance and reliability for critical financial operations.
Key Responsibilities:
Design and Development:
- Lead the technical design, analysis, and development of the foundational data hub across CIB Technology.
- Build and optimize real-time messaging applications using Kafka for high-throughput, low-latency data processing.
- Implement Kafka producers/consumers to handle data streams in a trading environment (a minimal producer sketch follows this list).
- Leverage Apache Flink for real-time stream processing, including event-driven data transformations and aggregations (a Flink sketch also follows this list).
- Integrate Kafka with trading platforms and financial systems, ensuring seamless interoperability.
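To give candidates a concrete sense of the producer/consumer work described above, here is a minimal sketch using the standard kafka-clients API. The broker address, topic name, and payload shape are placeholders for illustration only, not details of the client's systems:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TradeEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // Settings commonly tuned for durable, low-latency delivery:
        props.put("acks", "all");
        props.put("enable.idempotence", "true");
        props.put("linger.ms", "5");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record = new ProducerRecord<>(
                    "trade-events", "T-1001", "{\"tradeId\":\"T-1001\",\"notional\":1000000}");
            // Asynchronous send with a callback to confirm or surface delivery failures.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace(); // in production, route to alerting/retry logic
                } else {
                    System.out.printf("Published to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        }
    }
}
```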
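Likewise, a hedged sketch of the Flink stream-processing work: a DataStream job that consumes the same hypothetical topic and performs a keyed, windowed aggregation. The record layout ("symbol,notional"), consumer group, and job name are assumptions for illustration:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

public class NotionalAggregationJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Placeholder topic and consumer-group names.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("trade-events")
                .setGroupId("risk-hub-aggregator")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> rawTrades =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "trade-events");

        // Assume each record is "symbol,notional"; sum notional per symbol
        // over 5-second tumbling windows.
        rawTrades
                .map(line -> {
                    String[] parts = line.split(",");
                    return Tuple2.of(parts[0], Double.parseDouble(parts[1]));
                })
                .returns(Types.TUPLE(Types.STRING, Types.DOUBLE))
                .keyBy(t -> t.f0, Types.STRING)
                .window(TumblingProcessingTimeWindows.of(Time.seconds(5)))
                .sum(1)
                .print();

        env.execute("notional-aggregation-sketch");
    }
}
```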
System Integration:
- Develop and integrate Apache messaging frameworks (e.g., ActiveMQ, Kafka) for reliable, high-performance messaging.
- Work with MQ systems (e.g., IBM MQ, Tibco EMS) to ensure reliable message queuing and real-time data flow between distributed systems (a vendor-neutral JMS sketch follows this list).
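As a rough illustration of the MQ integration work, a minimal JMS publisher sketch. The ActiveMQ connection factory, broker URL, and queue name are assumptions; IBM MQ and Tibco EMS ship their own JMS-compliant ConnectionFactory implementations that would slot in the same way:

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.DeliveryMode;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;

import org.apache.activemq.ActiveMQConnectionFactory;

public class QueuePublisherSketch {
    public static void main(String[] args) throws Exception {
        // ActiveMQ factory and URL shown purely as placeholders.
        ConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616");

        Connection connection = factory.createConnection();
        try {
            connection.start();
            // A non-transacted, auto-acknowledge session keeps the sketch simple;
            // transacted or CLIENT_ACKNOWLEDGE sessions are typical where delivery
            // guarantees matter.
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Queue queue = session.createQueue("TRADE.STATUS"); // placeholder queue name
            MessageProducer producer = session.createProducer(queue);
            producer.setDeliveryMode(DeliveryMode.PERSISTENT); // survive broker restarts

            TextMessage message =
                    session.createTextMessage("{\"tradeId\":\"T-1001\",\"status\":\"CONFIRMED\"}");
            producer.send(message);
        } finally {
            connection.close();
        }
    }
}
```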
Performance and Maintenance:
- Troubleshoot and optimize Kafka and related messaging technologies for high-frequency trading scenarios.
- Collaborate with DevOps and SecOps teams for deployment, monitoring, and maintenance of Kafka and Flink clusters.
Continuous Improvement:
- Stay up to date with best practices and emerging trends in real-time messaging, stream processing, and fault-tolerant systems.
- Assist in the migration and enhancement of messaging solutions, ensuring scalability and reliability.
Qualifications:
Education:
- Bachelor’s degree in Computer Science, Software Engineering, or a related field.
Technical Expertise:
- Proficiency in Java with a strong focus on real-time messaging and Kafka.
- Hands-on experience with Kafka integration for high-volume, low-latency environments.
- Advanced knowledge of Apache Flink for real-time stream processing and analytics.
- Familiarity with event-driven architecture, distributed systems, and fault-tolerance principles.
- Experience with Apache messaging tools (e.g., ActiveMQ, Kafka) and MQ systems (e.g., IBM MQ, Tibco EMS).
Additional Skills (Preferred):
- Experience with Docker, Kubernetes, and microservices architecture.
- Strong understanding of message queuing systems and fault-tolerant design.