Global Channel Management is a technology company that specializes in recruiting and staff augmentation. Our account managers and recruiters have over a decade of experience across multiple verticals. GCM understands the challenges companies face in finding the skills and experience needed to cover day-to-day functions. Organizations need to reduce training and labor costs while still securing the best talent for the job.
Qualifications
The Hadoop Developer role requires 5+ years of experience with Python and Java/Scala.
Hadoop Developer requires:
- B.S. and M.S. in mathematics, computer science, or engineering
- 3+ years of demonstrated technical proficiency with Spark and big data projects, plus experience in data modeling
- Experience designing and developing data ingestion and processing/transformation frameworks leveraging tools such as ADF, NiFi, Sqoop, and Eclipse
- 5+ years of experience in the Big Data space with Apache Spark, Hive, and MapReduce, as well as a variety of Big Data ETL tools such as Kafka, Sqoop, Scala, and ZooKeeper
- Experience with Cloudera or Hortonworks; exposure to Teradata is beneficial
Hadoop Developer Duties:
- Translate complex functional and technical requirements into detailed design
- Perform technical development and implementation in Spark
- Load disparate data sets by leveraging various big data technologies, such as Kafka