Principal Software Engineer (Data)

Naperville, IL | Full Time | On-site

Egen is a fast-growing and entrepreneurial company with a data-first mindset. We bring together the best engineering talent working with the most advanced technology platforms, including Google Cloud and Salesforce, to help clients drive action and impact through data and insights. We are committed to being a place where the best people choose to work so they can apply their engineering and technology expertise to envision what is next for how data and platforms can change the world for the better. We are dedicated to learning, thrive on solving tough problems, and continually innovate to achieve fast, effective results.

As a Principal Software Engineer (Data), you will lead a team of data engineers in the design and creation of scalable data platforms and fault-tolerant pipelines for modern analytics and AI services. You will be a key voice, leveraging your subject matter expertise to help organizations modernize from disparate legacy data warehouse environments (Informatica, Hadoop) to modern distributed data environments (AWS Redshift, Snowflake).

What you'll do:

  • Lead and develop passionate data engineering teams through complex data migrations from disparate legacy ETL platforms to modern and highly scalable distributed data platforms.
  • Design and develop distributed ETL/ELT pipelines on cloud-native data stores; AWS Redshift and Snowflake preferred.
  • Facilitate fast and efficient data migrations through a deep understanding of design, mapping, implementation, management, and support of distributed data pipelines.
  • Prepare data mapping, data flow, production support, and pipeline documentation for all projects.
  • Create and document end-to-end implementation plans for data warehouses and data marts.
  • Consult with clients and their business, product, and data science teams to understand end-user requirements and analytics needs, and implement the most appropriate data platform technology and scalable data engineering practices.

What we’re looking for:

  • Minimum of a bachelor's degree or its equivalent in Computer Science, Computer Information Systems, Information Technology and Management, Electrical Engineering, or a related field.
  • You are a subject matter expert in standard concepts, best practices, and procedures within an enterprise data warehousing environment.
  • You have a strong background in distributed data warehousing with AWS Redshift (or a comparable platform such as Snowflake, BigQuery, or Azure Synapse Analytics).
  • You have expert programming/scripting knowledge in building and managing ETL pipelines using SQL, Python, and Bash.

Great to have:

  • Elasticsearch
  • Kafka
  • Any technical professional certification from AWS, Azure, or GCP