Data Engineer
McLean, VA

Job Description

McLean 2 (19052), United States of America, McLean, Virginia

At Capital One, we're building a leading information-based technology company. Still founder-led by Chairman and Chief Executive Officer Richard Fairbank, Capital One is on a mission to help our customers succeed by bringing ingenuity, simplicity, and humanity to banking. We measure our efforts by the success our customers enjoy and the advocacy they exhibit. We are succeeding because they are succeeding.

Guided by our shared values, we thrive in an environment where collaboration and openness are valued. We believe that innovation is powered by perspective and that teamwork and respect for each other lead to superior results. We elevate each other and obsess about doing the right thing. Our associates serve with humility and a deep respect for their responsibility in helping our customers achieve their goals and realize their dreams. Together, we are on a quest to change banking for good.

Data Engineer

We are looking for a savvy Data Engineer to join our growing team of analytics experts. You should have experience with traditional and modern technologies such as Apache Spark, NoSQL databases, Python, REST APIs, relational databases, Snowflake, PostgreSQL, Git, JavaScript, shell scripting, AWS, and Apache NiFi. You should also be proficient with Tableau and other BI tools.

In this role, you will be responsible for creating, maintaining, and optimizing data delivery and extraction from multiple data sources into our data warehouse. You will also create, enhance, and optimize the Tableau visualizations used by our business customers and internal leaders.

Basic Qualifications:

* Bachelor's Degree or Military Experience
* At least 2 years of experience using database management tools (SQL)
* At least 1 year of experience creating data visualizations using Tableau
* At least 1 year of experience using relational database systems (Snowflake, PostgreSQL, or MySQL)
* At least 1 year of experience in software development using Python
* At least 1 year of experience building data pipelines and using ETL tools

Preferred Qualifications:

* 2+ years of experience creating Tableau visualizations
* 2+ years of experience with relational database systems, including Snowflake, PostgreSQL, or MySQL
* 3+ years of experience with Scala or Python
* 3+ years of experience building data pipelines and using ETL tools
* 3+ years of experience building applications in a cloud environment
* 3+ years of experience creating automated solutions
* AWS Certification

Responsibilities of the role include:

* Build data pipeline frameworks to automate high-volume and real-time data delivery to our cloud platform
* Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies
* Develop and enhance applications using a modern technology stack such as Python, shell scripting, Scala, Postgres, AngularJS, React, and NiFi
* Work directly with Product Owners and end users to develop solutions in a highly collaborative and agile environment
* Create, enhance, and optimize data visualizations in Tableau using data from a variety of sources, including custom SQL queries
* Assemble large, complex data sets that meet functional and non-functional business requirements
* Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability
* Apply advanced SQL knowledge to relational databases and query authoring, along with working familiarity with a variety of databases
* Build and optimize 'big data' pipelines, architectures, and data sets
* Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
* Use strong analytic skills to work with unstructured datasets
* Build processes supporting data transformation, data structures, metadata, dependency management, and workload management
* Manipulate, process, and extract value from large, disconnected datasets
* Maintain working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores
* Demonstrate strong project management and organizational skills
* Support and work with cross-functional teams in a dynamic environment

At this time, Capital One will not sponsor a new applicant for employment authorization for this position.
