We are looking for a talented Data Engineer for an exciting opportunity on the data engineering team. You would design workflows for the data and analytics tools that are central to the 2019 road map, and manage data and infrastructure to efficiently query datasets numbering in the billions of records. Candidates will be evaluated on their ability to design large distributed technical solutions and to manage and optimize data pipeline projects that deliver actionable data and pipelines supporting the larger organization.
This position can be based in New York City, Chicago or Boston.
* Architect, design, and maintain data pipelines throughout the product lifecycle.
* Optimize and monitor existing data pipelines on AWS infrastructure.
* Write Python/Scala applications for data processing and job scheduling.
* Understand and manage massive data stores.
* Integrate data pipeline outputs into APIs built in Ruby on Rails.
* Expose large data sets to downstream consumers.
* Enjoy being challenged and solving complex problems daily.
* Design efficient and robust ETL workflows.
* Manage real-time streaming applications and data flows.
* Investigate, procure, and ramp up on new technologies.
* Work in teams and collaborate with others to clarify requirements.
* Build analytics tools on top of the data pipelines to provide meaningful insights into the data.