Fusion HCR is hiring! Direct Hire - Full Stack Data Engineer.
Fusion HCR is seeking experienced Full Stack Data Engineers to join our client's team. This role involves designing, implementing, and deploying robust data solutions. Candidates will be instrumental in spinning up new systems and implementing technologies that enhance the client's capabilities, including data orchestration, warehousing, and visualization tools, with a focus on end-to-end solutions.
Key Responsibilities:
- Design and implement scalable real-time or batch data pipelines for large datasets.
- Lead the integration and deployment of data solutions built on Databricks, Azure Data Factory, and related cloud technologies.
- Establish CI/CD pipelines using tools like Git, GitHub Actions, Jenkins, and Terraform.
- Support AI-driven data warehousing functions that enhance data science operations.
- Collaborate with cross-functional teams to take full advantage of out-of-the-box solutions and APIs.
- Lead the transition away from legacy tools such as Talend and Qlik to streamlined, in-house systems.
Qualifications:
- Bachelor's degree in a related field; Master's preferred.
- 5+ years as a data engineer with full-stack capabilities.
- Extensive hands-on experience with cloud platforms (Azure preferred, AWS/Google Cloud a plus).
- Proficiency in Python and SQL, plus hands-on experience with Databricks and Spark.
- Strong background in data visualization tools and DevOps best practices.
- Experience with implementing and optimizing CI/CD pipelines.
- Familiarity with data governance tools such as Collibra.
- Demonstrated ability to spin up and deploy systems from scratch.
- Advanced understanding of security best practices and cloud infrastructure.
- Hands-on experience implementing data migration and data processing using Azure services: ADLS, Azure Data Factory, Event Hubs, IoT Hub, Azure Stream Analytics, Azure Analysis Services, HDInsight, Databricks, Azure Data Catalog, Cosmos DB, ML Studio, AI/ML, etc.
- Extensive experience with Big Data technologies such as Apache Spark and streaming technologies such as Kafka and Event Hubs.
- Extensive experience in designing data applications in a cloud environment.
- Intermediate experience with RESTful APIs, messaging systems, and AWS or Microsoft Azure.
- Extensive experience in data architecture and data modeling.