Data Engineering Lead – Cloud

Naperville, IL | Full Time | Hybrid


Egen is a fast-growing and entrepreneurial company with a data-first mindset. We bring together the best engineering talent working with the most advanced technology platforms, including Google Cloud and Salesforce, to help clients drive action and impact through data and insights. We are committed to being a place where the best people choose to work so they can apply their engineering and technology expertise to envision what is next for how data and platforms can change the world for the better. We are dedicated to learning, thrive on solving tough problems, and continually innovate to achieve fast, effective results.

As a Data Engineering Lead, you will lead a team of insatiably curious data engineers in designing and building scalable data platforms and fault-tolerant pipelines for modern analytics and AI services. You will be a key voice, leveraging your subject matter expertise to modernize organizations from legacy, disparate data warehouse environments (Informatica and Hadoop) to modern distributed data environments (AWS Redshift or Snowflake).

Responsibilities:

  • Lead and develop passionate data engineering teams through complex data migrations from disparate legacy ETL platforms to modern and highly scalable distributed data platforms.
  • Design and develop distributed ETL/ELT pipelines with cloud-native data stores (AWS Redshift and Snowflake preferred).
  • Facilitate fast and efficient data migrations through a deep understanding of design, mapping, implementation, management, and support of distributed data pipelines.
  • Prepare data mapping, data flow, production support, and pipeline documentation for all projects.
  • Create and document end-to-end data warehouse and data mart implementation plans.
  • Consult with business, product, and data science teams to understand end-user requirements and analytics needs, and implement the most appropriate data platform technology and scalable data engineering practices.

What we’re looking for:

  • Minimum of a Bachelor’s degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering, or a related field.
  • You have extensive experience with legacy Informatica- and Hadoop-based EDW platforms and have led or significantly contributed to a major cloud migration.
  • You are a subject matter expert in standard concepts, best practices, and procedures within an enterprise data warehousing environment.
  • You have a strong background in distributed data warehousing with AWS Redshift (Snowflake, BigQuery, and/or Azure Data Warehouse also acceptable).
  • You have expert programming/scripting knowledge in building and managing ETL pipelines using SQL, Python, and Bash.