Job Title: Data Engineer - W2 Only - We can provide sponsorship
Duration: Long Term
Location: Durham, NC - Hybrid - 2 weeks in a month onsite - Local
This Data Engineer will do Snowflake development and build ETL pipelines between Oracle and the Snowflake analytics platform.
Spark with Scala is the most critical requirement.
Must-haves:
- Understanding of OOP concepts
- Working experience writing Spark applications in Scala
- AWS services: EMR, Lambda
- SQL
- Basic understanding of the Hadoop ecosystem and the HDFS file system
The Expertise We’re Looking For:
- Bachelor's or Master's degree in a technology-related field (e.g., Engineering, Computer Science) required, with a minimum of 8 years of design and development experience.
- Strong experience with the following database technologies: Snowflake and Oracle
- Experience in Object-Oriented Software development with Java and/or Python
- Hands-on experience with Spark (Java or Scala); experience using AWS EMR as the execution infrastructure is an advantage
- Experience in Cloud technologies (AWS), including Docker and EKS
- Experience building scalable and robust ETL solutions, business intelligence tools, databases, or data lakes such as Snowflake; expertise in SQL
- Strong design and analysis skills for a large data platform
- Experience with DevOps and Continuous Integration/Continuous Delivery tooling (Maven, Jenkins, GitHub, Terraform, Docker, Artifactory, etc.)
- Experience working within an Agile/Scrum development environment
- Excellent interpersonal and communication skills
- Excellent collaboration skills for working with multiple teams across the organization
- Financial Services industry experience preferred but not essential.