Tax Innovation - Data Engineer - Senior Associate, PwC
Dallas, TX

PwC (PricewaterhouseCoopers) is a global provider of auditing, accounting, and assurance services for organizations and individuals.


Job Description

A career in our Data Engineering practice, within Innovation services, will provide you with the opportunity to help our clients' tax departments redesign, redefine, and redeploy tax as a strategic asset across the enterprise. You'll focus on helping clients incorporate increased automation into the tax reporting process, expand analytic capabilities through data integration, and create solid internal controls that enable the Tax function to deliver higher-quality output and contribute more strategically to organizational decision making. Our team focuses on designing and building data models, codifying business rules, mapping data sources to data models, developing data quality solutions, and evaluating technologies to continue enhancing the capabilities of the broader innovation group.


As a Senior Associate, you'll work as part of a team of problem solvers with extensive consulting and industry experience, helping our clients solve their complex business issues from strategy to execution. Specific responsibilities include but are not limited to:

* Proactively assist in the management of several clients, while reporting to Managers and above
* Train and lead staff
* Establish effective working relationships directly with clients
* Contribute to the development of your own and team's technical acumen
* Keep up to date with local and national business and economic issues
* Be actively involved in business development activities to help identify and research opportunities on new/existing clients
* Continue to develop internal relationships and your PwC brand

Job Requirements and Preferences:

Basic Qualifications:

Minimum Degree Required:

Bachelor's Degree

Required Fields of Study:

Management Information Systems, Computer and Information Science, Systems Engineering, Electrical Engineering, Chemical Engineering, Industrial Engineering, Mathematics, Statistics, Mathematical Statistics

Minimum Years of Experience:

2 years of experience in Data Engineering or Software Engineering.

Preferred Qualifications:

Preferred Knowledge/Skills:

Demonstrates thorough knowledge and/or a proven record of success in the following areas:

* Python and experience with data extraction, data cleansing and data wrangling;
* SQL and experience with relational databases;
* Codification of business rules (analytics) in one of the programming languages listed above;
* Experience working with business teams to capture and define data models and data flows to enable downstream analytics;
* Data modeling, data mapping, data governance and the processes and technologies commonly used in this space;
* Data integration tools (e.g. Talend, SnapLogic, Informatica) and data warehousing / data lake tools;
* Systems development life cycle approaches such as Agile and Scrum; and,
* API based data acquisition and management.
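The Python data extraction, cleansing, and wrangling skills listed above can be sketched with a small, self-contained example; the tax-style records, column names, and cleanup rules below are illustrative assumptions, not part of this posting:

```python
import csv
import io
from datetime import datetime

# Illustrative raw extract: stray whitespace, inconsistent casing,
# mixed date formats, and a logical duplicate row (all data is made up).
RAW = """entity,period_end,tax_paid
 Acme Corp ,2023-12-31,1000.50
acme corp,12/31/2023,1000.50
Beta LLC,2023-06-30,250.00
"""

def parse_date(value: str) -> str:
    """Normalize the two date formats seen in the extract to ISO 8601."""
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
        try:
            return datetime.strptime(value.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognized date: {value!r}")

def clean(raw: str) -> list[dict]:
    """Cleanse and dedupe the raw extract into typed records."""
    rows = []
    seen = set()
    for row in csv.DictReader(io.StringIO(raw)):
        record = {
            "entity": row["entity"].strip(),       # trim stray whitespace
            "period_end": parse_date(row["period_end"]),
            "tax_paid": float(row["tax_paid"]),    # cast amounts to numbers
        }
        # Case-insensitive (entity, period) key drops logical duplicates.
        key = (record["entity"].casefold(), record["period_end"])
        if key not in seen:
            seen.add(key)
            rows.append(record)
    return rows

if __name__ == "__main__":
    for record in clean(RAW):
        print(record)
```

This is the shape of work the bullet points describe: pulling data out of a source format, normalizing types and values, and enforcing a simple data quality rule before anything downstream consumes the records.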

Demonstrates thorough abilities and/or a proven record of success in the following areas:

* Object-oriented and functional scripting languages such as Python, R, C/C++, Java, Scala, etc.;
* Relational SQL, distributed SQL and NoSQL databases including, but not limited to, MSSQL, PostgreSQL, MySQL, MemSQL, CrateDB, MongoDB, Cassandra, Neo4j, AllegroGraph, ArangoDB, etc.;
* Big data tools such as Hadoop, Spark, Kafka, etc.;
* Data modeling tools such as ERWin, Enterprise Architect, Visio, etc.;
* Data integration tools such as Talend, Informatica, SnapLogic, etc.;
* Data pipeline and workflow management tools such as Azkaban, Luigi, Airflow, etc.;
* Business Intelligence Tools such as Tableau, PowerBI, Zoomdata, Pentaho, etc.;
* Cloud technologies such as SaaS, IaaS and PaaS within Azure, AWS or Google and the associated data pipeline tools;
* Linux and a proven comfort level with bash scripting; and,
* Docker, Puppet, and agile development processes.

Demonstrates thorough abilities and/or a proven record of success in the following areas:

* Building enterprise data pipelines and the ability to craft code in SQL, Python, and/or R;
* Building batch data pipelines with relational and columnar database engines as well as Hadoop or Spark, and understanding their respective strengths and weaknesses;
* Building scalable and performant data models;
* Applying computer science fundamentals such as data structures, algorithms, programming languages, distributed systems, and information retrieval;
* Presenting technical and non-technical information to various audiences;
* Transforming and analyzing large data sets and deriving insights from data using various BI and data analytics tools;
* Thinking differently to solve complex business problems;
* Securely handling data both in motion and at rest such as communication protocols, encryption, authentication, and authorization;
* Working with Graph databases and graph modeling; and,
* Working with the requirements of data science teams.
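A minimal illustration of the batch-pipeline pattern described above, using Python's built-in sqlite3 module as a stand-in relational engine (the schema, table names, and aggregation rule are assumptions for the sketch, not requirements from the posting):

```python
import sqlite3

def run_pipeline(records: list[tuple[str, float]]) -> list[tuple[str, float]]:
    """One batch extract-transform-load pass over an in-memory database."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE staging (entity TEXT, amount REAL)")
    conn.execute("CREATE TABLE reporting (entity TEXT, total REAL)")

    # Load the raw batch into a staging table first, so a failed
    # transform never touches the reporting layer.
    conn.executemany("INSERT INTO staging VALUES (?, ?)", records)

    # Transform: codify the business rule (total per entity) in SQL.
    conn.execute(
        "INSERT INTO reporting "
        "SELECT entity, SUM(amount) FROM staging GROUP BY entity"
    )
    result = conn.execute(
        "SELECT entity, total FROM reporting ORDER BY entity"
    ).fetchall()
    conn.close()
    return result

if __name__ == "__main__":
    batch = [("acme", 100.0), ("beta", 40.0), ("acme", 25.0)]
    print(run_pipeline(batch))  # [('acme', 125.0), ('beta', 40.0)]
```

The staging-then-reporting split is the design choice the bullets hint at: keeping raw loads separate from transformed output makes each batch run idempotent and easier to audit.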

All qualified applicants will receive consideration for employment at PwC without regard to race; creed; color; religion; national origin; sex; age; disability; sexual orientation; gender identity or expression; genetic predisposition or carrier status; veteran, marital, or citizenship status; or any other status protected by law. PwC is proud to be an affirmative action and equal opportunity employer.

About PwC

10,001 employees

300 Madison Avenue
