For over a century, Neiman Marcus Group has served the unique needs of our discerning customers by staying true to the principles of our founders: to be the premier omni-channel retailer of luxury and fashion merchandise, dedicated to providing superior service and a distinctive shopping experience in our stores and on our websites. Neiman Marcus Group comprises the Specialty Retail Stores division, which includes Neiman Marcus and Bergdorf Goodman, and our international brand, mytheresa.com. Our portfolio of brands offers the finest luxury and fashion apparel, accessories, jewelry, beauty, and home décor. The Company operates more than 40 Neiman Marcus full-line stores in the most affluent markets across the United States, including U.S. gateway cities that draw an international clientele. In addition, we operate two Bergdorf Goodman stores in landmark locations on Fifth Avenue in New York City. We also operate more than 40 Last Call by Neiman Marcus off-price stores that cater to a value-oriented yet fashion-minded customer. Our upscale eCommerce and direct-to-consumer division includes NeimanMarcus.com, BergdorfGoodman.com, Horchow.com, LastCall.com, and CUSP.com. Every day, each of our 15,000 NMG associates works toward the goal of enabling our customers to shop any of our brands "anytime, anywhere, and on any device." Whether through the merchandise we sell, the customer service we offer, or our investments in technology, everything we do is meant to enhance the customer experience across all channels and brands.
Neiman Marcus Group has an immediate opening for a Lead Data Platform Engineer.
The Lead Data Platform Engineer will bring a unique combination of the business acumen needed to interface directly with key stakeholders and understand business challenges, and the skills and vision required to translate those needs into world-class technical solutions using the latest technologies.
This person will be in a hands-on role as part of a development team responsible for building data engineering solutions for the NMG Enterprise using cloud-based data platforms. They will work closely with solution architects and support teams and take the lead on day-to-day development and support for data engineering workloads. In this role, you need to be equally skilled with the whiteboard and the keyboard.
The primary focus of this position is to provide design and development support to the Data Platform team. Day-to-day activities will include, but are not limited to: ingesting various data sources into the data platform, designing and deploying consolidation and summary tables, designing and deploying data marts, data/process modeling, systematic auditing of load processes, creating batch process automation, and analyzing source data from various applications for reporting, business intelligence, and data science initiatives.
This position is a hands-on, intensive role working side by side with our end-user partners to drive better analytics, better reporting, and, most importantly, competitive advantage. This person will be an integral part of the Customer Data Platform team, working with architects, developers, and data platform engineers to support the overall Neiman Marcus enterprise.
ESSENTIAL DUTIES AND RESPONSIBILITIES (include the following; other duties not listed may be assigned):
* Work primarily with architects and at times with business partners and data science teams to understand business context and craft best-in-class solutions to their toughest problems
* Provide data modeling, process modeling, and data mart design support.
* Create Python Scripts/SQL scripts in support of data platform load and batch processes.
* Design and deploy consolidation and summary tables as required within the data warehousing environment.
* Perform periodic performance assessments of the automated load processes.
* Be proactive in identifying and resolving issues.
* Provide specialized support for our legacy platforms as well as the new EDW.
* Document and standardize deliverables.
* Create robust, automated pipelines to ingest and process structured and unstructured data from source systems into analytical platforms, using batch and streaming mechanisms and leveraging a cloud-native toolset.
* Implement automation to optimize data platform compute and storage resources.
* Develop and enhance end-to-end monitoring capabilities for cloud data platforms.
* Implement custom data applications as required to deliver actionable insights.
* Provide regular status updates to all relevant stakeholders.
* Participate in daily scrum calls and provide clear visibility into work products.
* Participate in developing project plans and timelines and in providing estimates.
* Provide hands-on technical assistance in all aspects of data engineering design and implementation, including data ingestion, data models, data structures, data storage, data processing, and data monitoring at scale.
* Develop data engineering best practices with considerations for high data availability, fault tolerance, computational efficiency, cost, and quality
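As a rough illustration of the load, consolidation, and audit duties described above, here is a minimal sketch in Python. All table and column names are hypothetical, and `sqlite3` stands in purely for illustration for a warehouse such as Vertica or Snowflake:

```python
import csv
import io
import sqlite3

# Hypothetical CSV extract from a source system.
SOURCE_CSV = io.StringIO(
    "customer_id,order_total\n"
    "1,250.00\n"
    "2,99.50\n"
    "1,75.25\n"
)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (customer_id INTEGER, order_total REAL)")
conn.execute(
    "CREATE TABLE cust_order_summary (customer_id INTEGER PRIMARY KEY, total_spend REAL)"
)

# Ingest: load the raw extract into a staging table.
rows = [(int(r["customer_id"]), float(r["order_total"]))
        for r in csv.DictReader(SOURCE_CSV)]
conn.executemany("INSERT INTO stg_orders VALUES (?, ?)", rows)

# Consolidate: aggregate staging data into a summary table.
conn.execute(
    "INSERT INTO cust_order_summary "
    "SELECT customer_id, SUM(order_total) FROM stg_orders GROUP BY customer_id"
)

# Audit: a simple systematic check that no amounts were lost in the load.
staged = conn.execute("SELECT SUM(order_total) FROM stg_orders").fetchone()[0]
summarized = conn.execute("SELECT SUM(total_spend) FROM cust_order_summary").fetchone()[0]
assert abs(staged - summarized) < 1e-9, "load audit failed"

print(sorted(conn.execute("SELECT * FROM cust_order_summary").fetchall()))
```

In practice a script like this would be one task in an orchestrated batch pipeline (e.g., an Airflow DAG), with the audit step gating downstream jobs.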
QUALIFICATIONS
* 4-year college degree in Computer Science, Information Technology, or equivalent demonstrated experience; Master's degree preferred.
* Strong SQL development skills using databases such as Oracle and Vertica.
* Experience with cloud databases such as Snowflake or Redshift is a plus.
* Experience with core AWS services such as EC2 and S3.
* Certification preferred: AWS Certified Big Data or an equivalent cloud or big data platform certification.
* 4 years of experience in the data and analytics space
* Solid programming experience in Python; expert-level (4/5) proficiency required.
* Experience with workload automation tools such as Airflow or AutoSys.
* 4 years of experience developing and implementing enterprise-level data solutions using Python, Java, Scala, Spark, Airflow, and Hive.
* 3 years of experience with key aspects of software engineering such as parallel data processing, data flows, REST APIs, JSON, XML, and microservice architectures.
* 6 years of experience with RDBMS concepts, including strong data analysis and SQL skills.
* 3 years of proficiency with Linux command-line tools and Bash scripting.
* Cloud data warehouse experience - Snowflake is a plus
* Must be an expert in ETL development on Unix servers.
* Must have demonstrable working knowledge of modern information delivery and management practices for on-premises and cloud environments.
* Must have demonstrable experience delivering robust information delivery and management solutions as part of a fast-paced data platform program.
* Must be an expert at applying business rules and logic using SQL scripts.
* Must have working knowledge of various data modeling techniques (3NF, denormalized, star schema).
* Position requires a self-starter, capable of quickly turning around vaguely defined projects with minimal supervision or assistance.
* Ability to conduct analysis of source data sets to achieve target data set objectives.
* Strong verbal and written communication skills are a must; interacting with the business community and users is a core requirement of the role.
* Must be able to provide specialized support for our legacy platforms as well as the new cloud-based data platform.
* Candidate must possess strong initiative.
* Candidate must be able to run with a project independently, given as little as a few sentences of direction; candidates who require detailed design specs will not be successful.
* Retail experience is a plus.
* Experience migrating data from an on-premises database to a cloud-based data warehouse platform is a strong plus.
* Experience with Qlik, Business Objects, or Tableau is a plus.
* Experience working with terabyte-sized data warehouses and complex ETL mappings that process 50 million records per day is strongly preferred.
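To ground the data-modeling requirement above, here is a minimal star-schema sketch: one fact table keyed to two dimension tables, plus a typical rollup query. Table and column names (`fact_sales`, `dim_customer`, `dim_product`) are hypothetical, and `sqlite3` is used purely for illustration:

```python
import sqlite3

# A toy star schema: a central fact table referencing two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_product  (product_key  INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    amount REAL
);
INSERT INTO dim_customer VALUES (1, 'Alice'), (2, 'Bob');
INSERT INTO dim_product  VALUES (10, 'Apparel'), (20, 'Beauty');
INSERT INTO fact_sales   VALUES (1, 10, 120.0), (1, 20, 30.0), (2, 10, 80.0);
""")

# A typical business query against the star: spend by customer and category.
result = conn.execute("""
    SELECT c.name, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    JOIN dim_product  p ON p.product_key  = f.product_key
    GROUP BY c.name, p.category
    ORDER BY c.name, p.category
""").fetchall()
print(result)
```

The same shape scales to warehouse platforms such as Snowflake or Vertica; the star layout keeps fact rows narrow and pushes descriptive attributes into the dimensions.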
NICE TO HAVE
* Kubernetes and Docker experience a plus
* Prior working experience with a data science workbench
* Data Modeling experience a plus
* Strong, concise, and grammatically correct oral and written communication skills.
* Basic math skills
REASONING/ANALYTICAL ABILITY
* Complex problem-solving skills
* Extensive data analytical skills
* Initiative to develop efficiencies and process improvements