Expertise in Scala functional programming. Hands-on experience in Scala development; Spark experience is a must. Adept at Apache Spark programming with Scala. Good knowledge of configuring and working on multi-node clusters and the Spark distributed data processing framework. Experience designing data pipelines, complex event processing, and analytics components using big data technology (Scala/Spark). Experience working with large volumes of data (terabytes), analyzing data structures, and designing effectively on a Hadoop cluster.