Big Data Specialist

job
  • Impetus
Job Summary
Location
Phoenix, AZ
Job Type
Contract
Visa
Any Valid Visa
Salary
Pay Rate
Qualification
BCA
Experience
2 - 10 Years
Posted
03 Feb 2025
Job Description

At Impetus Technologies, we drive innovation and deliver cutting-edge solutions to our clients. We are hiring an experienced Big Data Engineer with a strong focus on GCP/Azure/AWS to join our team in Phoenix, AZ. The ideal candidate will have extensive experience with Hadoop, Spark (Batch/Streaming), Hive, and Shell scripting, as well as solid programming skills in Java or Scala. A deep understanding of and hands-on experience with GCP/Azure/AWS are critical for this role.

Qualifications:

  • Proven experience with Hadoop, Spark (Batch/Streaming), and Hive.
  • Proficiency in Shell scripting and programming languages such as Java and/or Scala.
  • Strong hands-on experience with GCP/Azure/AWS and a deep understanding of their services and tools.
  • Ability to design, develop, and deploy big data solutions in a GCP/Azure/AWS environment.
  • Experience with migrating data systems to GCP/Azure/AWS.
  • Excellent problem-solving skills and the ability to work independently or as part of a team.
  • Strong communication skills to effectively collaborate with team members and stakeholders.

Responsibilities:

  • Development: Design and develop scalable big data solutions using Hadoop, Spark, Hive, and GCP/Azure/AWS services.
  • Design: Architect and implement big data pipelines and workflows optimized for GCP/Azure/AWS, ensuring efficiency, security, and reliability.
  • Deployment: Deploy big data solutions on GCP/Azure/AWS, leveraging the best practices for cloud-based environments.
  • Migration: Lead the migration of existing data systems to GCP/Azure/AWS, ensuring a smooth transition with minimal disruption and optimal performance.
  • Collaboration: Work closely with cross-functional teams to integrate big data solutions with other cloud-based services and business goals.
  • Optimization: Continuously optimize big data solutions on GCP/Azure/AWS to improve performance, scalability, and cost-efficiency.

Mandatory Skills

Hadoop, Spark, Hive, GCP/Azure/AWS
