
Principal Cloud Data Platform Engineer
Chicago, IL



Job Description

At Discover, be part of a culture where diversity, teamwork and collaboration reign. Join a company that is just as focused on its employees as it is on its customers, and is consistently recognized for both. We're all about people, and our employees are why Discover is a great place to work. Be the reason we help millions of consumers build a brighter financial future, and achieve yours along the way with a rewarding career.


We are looking for a seasoned Cloud Data Platform Engineer with cloud platform experience to join our rapidly growing technology team. Central to our model and overall organizational success, we develop capabilities used across the analytics enterprise, create products and solutions that drive business decisions, enable the execution of business strategies, and measure business outcomes. While this role is dynamic in nature, specific responsibilities of the Cloud Data Platform Engineer include:

Duties include:

* Act as a lead engineer on one of the core teams responsible for enabling new features and optimizing data platforms that support Discover's Advanced Analytics user community
* Engineer leading-edge analytics environments, including software, data platforms and user tools, that enable productivity and application integration in a secure fashion
* Adapt to new technologies and approaches for the product as we move toward cloud-based deployments leveraging containers on Kubernetes
* Understand how microservices need to be architected to run within containerized platforms
* Architect and design batch and streaming data platforms to support analytic use cases
* Participate in the development of internal software products
* Embody and exemplify our DevOps culture
* Promote a risk-aware culture and ensure efficient and effective risk and compliance management practices by adhering to required standards and processes

Uses hands-on experience with next-generation technologies to contribute to a team that delivers the latest data-driven platforms and analytic technologies.


* Builds and leads a high-performing, Agile team focused on next-generation data and analytic technologies.
* Provides senior-level technical consulting to create and enhance analytic platforms and tools that deliver state-of-the-art Big Data capabilities to analytic users and applications.
* Provides senior-level technical consulting to application development teams during application design and development for highly complex and critical data projects.
* Participates in teams delivering all data projects, including migration to new data technologies for unstructured, streaming, and high-volume data.
* Develops and deploys distributed-computing Big Data applications using open-source frameworks.
* Utilizes programming languages such as Java and Python, and NoSQL databases such as Cassandra.
* Utilizes Hadoop modules such as YARN and MapReduce, and related Apache projects such as Hive, HBase, and Pig.
* Leverages DevOps techniques and practices such as Continuous Integration, Continuous Deployment, Test Automation, Build Automation, and Test-Driven Development to enable the rapid delivery of end-user capabilities.
* Builds and administers large-scale data platforms such as a Hadoop data lake, integrating next-generation analytic tools into the Big Data ecosystem and applying expertise in real-time data technologies.

Minimum Qualifications

At a minimum, here's what we need from you:

* Bachelor's Degree in Computer Science or related field
* 4+ years of experience in Data Platform Administration, Engineering, or related field
* In lieu of a degree, 6+ years of experience in Data Platform Administration, Engineering, or related field

Preferred Qualifications

If we had our say, we'd also look for:

* Experience in at least one language or scripting framework (Bash, Python, Java, etc.) and a desire to pick up another language
* 5+ years in Linux/Unix, including basic commands, shell scripting and solution engineering
* 2+ years of cloud platform experience, preferably AWS
* 2+ years with container orchestration platforms (e.g., Kubernetes, OpenShift, Docker)
* Hands-on experience with cloud platforms and technologies such as S3, DynamoDB, Kinesis, EC2, Spark, Lambda and other cloud data technologies
* Experience working with automated build and continuous integration systems (Chef, Jenkins, Docker)
* Industry experience in Financial Services or other regulated industries


Discover Financial Services is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, protected veteran status, among other things, or as a qualified individual with a disability.
