The Advanced Analytics group has implemented and operates a new Big Data analytics platform that enables self-service analytics, decision engineering support, machine learning, modeling, forecasting, and optimization. This role is responsible for creating and maintaining scalable, reliable, consistent, and repeatable platforms, systems, and models that support data, data products, and analytical products for advanced analytics by gathering, processing, exploring, and modeling raw and diverse data at scale. It requires lifecycle management of multiple data sources, data discovery, and models. The role architects and delivers models, analytics, automation, and self-service, which are major tenets of the analytics platform.
MAJOR DUTIES AND RESPONSIBILITIES
Actively and consistently supports all efforts to simplify and enhance the customer experience.
Frame and model meaningful business and engineering scenarios that impact critical business and engineering processes, architectures, and/or decisions.
Make and implement strategic recommendations for data architectures, data collection processes, and analytics platform integration.
Research and develop machine learning models and algorithms for data analysis and discovery.
Develop innovative and effective approaches to solve analytics problems and communicate results and methodologies.
Leverage multiple data sources to produce data products that solve the needs of business and engineering for business intelligence, operational analytics, descriptive models, predictive models, diagnostic models, and prescriptive models.
Apply data mining, data discovery, and machine learning techniques to large structured and unstructured datasets for exploratory data analysis and data product creation.
Create data, operational, and analytics architectures that enable data engineers to maintain scalable, reliable, consistent and repeatable systems that support data operations for analytics.
Gather and process raw data at scale from any source and in any format.
Profile data to measure quality, integrity, accuracy, and completeness.
Balance workload and operational demands across open source technologies, cloud services, and commercial solutions while optimizing cost and time-to-solution.
Architect self-monitoring, robust, scalable interfaces and data pipelines for 24/7 operations.
Create highly reusable code modules, templates, and packages that can be leveraged across the data and analytics lifecycle.
Increase speed to delivery by architecting and implementing automated solutions across the data and analytics lifecycle.
Mentor, educate, and provide senior leadership to data analysts, data engineers, and business intelligence analysts.
Skills/Abilities and Knowledge
* Ability to read, write, speak and understand English.
* DBA, user, and programming experience with SQL-based, NoSQL-based, and columnar database technologies.
* Visualization or BI tools, such as Tableau, Zoomdata, MicroStrategy, RapidMiner, or Microsoft Power BI.
* Creating proof of concept experiments for analytics, machine learning, or visualization tools that include hypotheses, test plans, and outcome analysis.
* Experience receiving, converting, and cleansing big data.
* Solid statistical knowledge and techniques.
* Program, product, or project management experience delivering analytics results.
* Strong background in Linux/Unix/CentOS and Windows installation and administration.
* Ability to identify and resolve end-to-end performance, network, server, cloud, and platform issues.
* Excellent pattern recognition and predictive modeling skills.
* Experience with Hadoop, Spark, and/or Snowflake.
* Keen attention to detail with the ability to effectively prioritize and execute multiple tasks.
* Master's degree in data science, an engineering discipline, computer science, statistics, applied math, or a related field preferred.
Related Work Experience
* 7 years - User and system admin ownership with Linux/Unix/CentOS and Windows.
* 7 years - Hands-on working experience with RDBMS, SQL, scripting, and coding.
* 5+ years - Developing and deploying machine learning and analytics models.
* Experience delivering at least one major system where the candidate was responsible for designing the architecture, implementing, operating, supporting, and managing the release lifecycle through end of life.
Skills/Abilities and Knowledge
* Familiarity with data workflow/data prep platforms, such as Alteryx, Pentaho, or KNIME.
* Familiarity with automation/configuration management using Puppet, Chef, or an equivalent.
* Knowledge of best practices and IT operations in an always-up, always-available service.
Charter Technical Engineering Center
Highly collaborative and innovative workspace