Position: Data Engineer (ETL, Azure, Databricks)
Location: Warren, MI
Duration: Open-ended
Shift: 8AM – 5PM EST
Hybrid position
Job Description: As a Data Engineer, you will build industrialized data assets and data pipelines in support of Business Intelligence and Advanced Analytics objectives. The Data Engineer leads and delivers new and innovative data-driven solutions that are elegant and professional. You will work closely with our forward-thinking Data Scientists, BI developers, System Architects, and Data Architects to deliver value to our vision for the future. Our team focuses on writing maintainable tests and code that meet the customer's needs and scale without rework. Our engineers and architects work in highly collaborative environments across many disciplines (user experience, databases, streaming technology, custom rules engines, AI/ML, and most web technologies). We work on innovative technologies – understanding and inventing modern designs and integration patterns along the way. Our Data Engineers must:
· Assemble large, complex data sets that meet functional and non-functional business requirements.
· Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
· Lead and deliver exceptional data-driven solutions across many different languages, tools, and technologies.
· Develop a culture that takes challenging and complex ideas and turns them into production solutions.
· Have a broad, enterprise-wide view of the business and varying degrees of appreciation for strategy, process, capabilities, enablers, and governance.
· Be able to apply themselves critically to solving problems in an interconnected and integrated environment.
· Think strategically about adopting innovative technologies that are beyond the horizon and deliver on sound solution design.
· Create high-level models that can be leveraged in future analysis to extend and mature the business architecture.
· Work toward continuously raising our standards of engineering excellence in quality, efficiency, and repeatable design.
· Perform hands-on development, lead code reviews and testing, create automation tools, and build proofs of concept.
· Work alongside the operations team during any production issues related to the platform.
· Apply best practices such as agile methodologies, design thinking, and continuous deployment that allow the team to innovate fast.
· Deliver solutions across Big Data applications that support business strategies and create business value.
· Build tools and automation to make deploying to and monitoring the production environment more repeatable.
Candidate Requirements:
· Bachelor's degree in the data space or a related field (computer science, information technology, or mathematics, for example)
· 5-7 years of experience building ETL (extract, transform, and load) pipelines
· ETL code uses different technologies to connect to a source database, extract the relevant tables and fields, apply transformation and cleaning, then connect to a target database and load the data into it (see the sketch after this list)
· Working with Azure cloud technologies
· Understanding of databases, database structures, data models, and data architecture
· Expert in multiple tools and technologies, including Azure, Databricks, Scala, Spark, Python, SQL, .NET, REST APIs, and Angular
· SQL Server, PostgreSQL, SQL Server Integration Services (SSIS), complex ETL with massive data volumes, Power BI
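The following is a minimal sketch of such an ETL step in PySpark (one of the listed technologies and the engine behind Databricks). The connection strings, table names, and column names are hypothetical placeholders, not details from this posting:

from pyspark.sql import SparkSession, functions as F

# Minimal ETL sketch. All connection details, tables, and columns below
# are hypothetical placeholders.
spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: connect to the source database and read a table over JDBC.
source_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://source-host:1433;databaseName=sales")  # hypothetical
    .option("dbtable", "dbo.orders")  # hypothetical
    .option("user", "etl_user")
    .option("password", "***")
    .load()
)

# Transform: clean and normalize the extracted fields.
clean_df = (
    source_df
    .dropDuplicates(["order_id"])  # hypothetical key column
    .filter(F.col("order_total").isNotNull())
    .withColumn("order_date", F.to_date("order_date"))
)

# Load: connect to the target database and write the cleaned data.
(
    clean_df.write.format("jdbc")
    .option("url", "jdbc:postgresql://target-host:5432/warehouse")  # hypothetical
    .option("dbtable", "public.orders_clean")  # hypothetical
    .option("user", "etl_user")
    .option("password", "***")
    .mode("append")
    .save()
)

On Databricks the source and target would often be Delta tables rather than raw JDBC connections, but the extract/transform/load shape is the same.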
Top 3 skills:
· 5-7 years of experience building ETL pipelines
· Working with Azure cloud technologies
· Understanding of databases, database structures, data models, and data architecture