Data Engineer
3 days, Phoenix
As a Data Engineer with our client, you will play a crucial role in designing, building, and maintaining the data architecture that powers their analytical and operational capabilities. You will work closely with developers, QA engineers, the Scrum Master, and other stakeholders to ensure data availability, reliability, and integrity. Your expertise will be key in transforming raw data into actionable insights that drive the business forward.
Key Responsibilities:
Design and Develop Data Pipelines:
Create and maintain scalable ETL (Extract, Transform, Load) processes to gather, process, and store data from various sources.
Optimize data pipelines for performance and reliability.
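The pipeline work described above can be sketched as a minimal extract-transform-load step in Python; the source rows, field names, and quality rule here are illustrative assumptions, not part of the role:

```python
# Minimal ETL sketch: extract raw rows, transform and validate them,
# then load into a target store. All data and field names are illustrative.

def extract():
    # Extract: raw CSV-style rows as they might arrive from a source system.
    return ["1,alice,1200", "2,bob,", "3,carol,950"]

def transform(rows):
    # Transform: parse and normalize each row, applying a simple
    # data quality check that drops records with a missing amount.
    records = []
    for row in rows:
        user_id, name, amount = row.split(",")
        if not amount:
            continue  # quality check: skip incomplete records
        records.append({"id": int(user_id), "name": name.upper(), "amount": int(amount)})
    return records

def load(records):
    # Load: write to a target keyed by id (a dict standing in for a database).
    return {r["id"]: r for r in records}

store = load(transform(extract()))
print(len(store))  # the incomplete row is dropped
```

In a production pipeline each stage would read from and write to real systems (files, APIs, message queues, a warehouse), but the extract/transform/load separation shown here is the structure being described.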
Database Management:
Design, implement, and manage robust, scalable, and efficient databases.
Ensure data security, availability, and performance.
Data Integration:
Integrate data from multiple sources, ensuring consistency and accuracy.
Work with APIs and other data integration tools.
Data Quality and Governance:
Implement data quality checks and ensure data governance practices are followed.
Monitor data quality and integrity, addressing any issues promptly.
Collaboration:
Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions that meet their needs.
Work closely with DevOps and IT teams to ensure seamless deployment and operation of data solutions.
Documentation and Best Practices:
Document data processes, workflows, and architectures.
Promote best practices in data engineering and contribute to the continuous improvement of the data infrastructure.
Qualifications:
Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
Proven experience as a Data Engineer or in a similar role.
Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL).
Experience with big data technologies (e.g., Hadoop, Spark, Kafka).
Proficiency in programming languages such as Kotlin and Java.
Familiarity with front-end technologies, particularly React.js, for data visualization and UI integration.
Experience with Google Cloud Platform and its data-related services (e.g., BigQuery, Dataflow, Pub/Sub).
Strong understanding of data warehousing concepts and experience with data warehouse solutions (e.g., BigQuery).
Knowledge of data modeling, data architecture, and ETL processes.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills.