
Solutions Consultant - Public Sector
Washington, DC


Job Description:

Cloudera is seeking an experienced Solutions Consultant to join our growing Public Sector Professional Services team. This key role has two major responsibilities: first, to work directly with our customers and partners to optimize their plans and objectives for designing and deploying Apache Hadoop environments; and second, to assist in building and designing reference configurations that enable our customers and inform our product.

The Solutions Consultant also facilitates the flow of communication between Cloudera teams and the customer. For this strategically important role, we are seeking outstanding talent to join our team.

Responsibilities:

* Work directly with prospective customers' technical resources to devise and recommend solutions based on a clear understanding of their requirements


* Represent Cloudera Professional Services while on site with clients, demonstrating subject-matter expertise in big data and data modernization


* Analyze complex distributed production deployments, make recommendations, and implement Cloudera solutions to optimize performance


* Work closely with Cloudera's teams at all levels to ensure rapid response to customer questions and project blockers


* Help develop reference Hadoop architectures and configurations


* Drive delivery projects with customers to ensure successful adoption of Cloudera technologies


* Travel according to customer requirements



Qualifications:

* US Citizenship is required.


* More than three years of professional services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions


* Experience designing and deploying large-scale production Hadoop solutions


* Experience installing and administering multi-node Hadoop clusters


* Experience designing queries against data in a Hadoop environment using tools such as Apache Hive, Apache Druid, Apache Phoenix, or others (a brief Hive example is sketched after this list)


* Knowledge of distributed systems, complex data pipelines and ETL, and common ETL packages/libraries


* Strong experience implementing software and/or solutions in enterprise Linux or Unix environments


* Strong understanding of various enterprise security solutions such as LDAP and/or Kerberos


* Strong understanding of network configuration, devices, protocols, speeds, and optimizations


* Strong understanding of Java development, debugging, and profiling


* Solid background in database administration or design


* Familiarity with scripting tools such as Bash shell scripts, Python, and/or Perl


* Ability to understand and translate customer requirements into technical requirements


* Excellent verbal and written communication skills


* Bachelor's degree in Computer Science or a related field, or equivalent
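
As a concrete illustration of the Hive query experience referenced above, the following is a minimal sketch of issuing an analytical HiveQL query from Python using the PyHive client. The host, database, table, and column names are hypothetical placeholders chosen for this example, not details from the posting.

# Minimal sketch (illustrative only): querying Hive from Python via the
# PyHive client. Host, database, table, and column names are hypothetical.
from pyhive import hive

def top_agencies_by_event_count(limit=10):
    # Connect to a hypothetical HiveServer2 endpoint.
    conn = hive.Connection(host="hive-gateway.example.gov", port=10000,
                           username="analyst", database="public_sector")
    try:
        cursor = conn.cursor()
        # Simple aggregation: count events per agency, largest first.
        cursor.execute(
            "SELECT agency, COUNT(*) AS event_count "
            "FROM events "
            "GROUP BY agency "
            "ORDER BY event_count DESC "
            "LIMIT {:d}".format(limit)
        )
        return cursor.fetchall()
    finally:
        conn.close()

if __name__ == "__main__":
    for agency, count in top_agencies_by_event_count():
        print(agency, count)

The same query could equally be run through Beeline or another SQL client; the sketch is only meant to show the kind of query design the role involves.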



Nice to have, but not required:

* TS/SCI, TS, Secret, or Public Trust clearance


* Ability to understand big data use-cases and recommend standard design patterns commonly used in Hadoop-based deployments.


* Experience with cloud platforms and deployment automation


* DC/MD/VA residence is required; relocation options are available.

