Hadoop-Spark Developer (8 to 10 Years), Mumbai
Capgemini
Mumbai, MH, IN

Job Responsibilities

Profile: Hadoop and Spark Developer

Job description:

  • Involved as an architect in a Hadoop implementation project, covering analysis, design, and development on the Hadoop platform
  • Must have done data transformation using Spark with Scala (an illustrative sketch follows this list)
  • Must have done data ingestion using Sqoop; ingestion using other utilities such as Kafka and Flume is desirable
  • Experience loading data from disparate data sets into HDFS
  • Good knowledge of Hive and at least one other NoSQL database
  • Good knowledge of Java is preferable; ability to write high-performance, reliable, and maintainable code
  • Ability to write MapReduce jobs
  • Good knowledge of database structures, theories, principles, and practices
  • Hands-on experience with HiveQL
  • Familiarity with data loading tools such as Kafka and Sqoop
  • Knowledge of workflow schedulers such as Oozie
  • Analytical and problem-solving skills applied to the Big Data domain
  • Proven understanding of Hadoop, HBase, and Hive
  • Good aptitude in OOP, multi-threading, and concurrency concepts
  • Working knowledge of build tools such as Maven and SBT
  • Experience required: 8 to 11 years
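
As a flavour of the Spark-with-Scala transformation work listed above, here is a minimal sketch of a batch job that reads raw data landed in HDFS, applies a transformation, writes the result as a Hive table, and queries it back with HiveQL through Spark SQL. All paths, table names, and column names (hdfs:///data/raw/orders, analytics.daily_order_summary, customer_id, amount, order_ts) are hypothetical placeholders, not part of this posting.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object OrdersEtl {
      def main(args: Array[String]): Unit = {
        // Hive support lets DataFrames be saved as, and queried from, Hive tables.
        val spark = SparkSession.builder()
          .appName("OrdersEtl")
          .enableHiveSupport()
          .getOrCreate()

        // Read raw CSV files landed in HDFS (e.g. by a Sqoop import or a Flume/Kafka sink).
        val raw = spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("hdfs:///data/raw/orders")

        // Transform: drop bad rows, derive a date column, aggregate per customer per day.
        val summary = raw
          .filter(col("amount") > 0)
          .withColumn("order_date", to_date(col("order_ts")))
          .groupBy(col("customer_id"), col("order_date"))
          .agg(sum("amount").as("daily_total"), count("*").as("order_count"))

        // Persist the result as a Hive table for downstream consumers.
        summary.write.mode("overwrite").saveAsTable("analytics.daily_order_summary")

        // The same table can then be queried with HiveQL via Spark SQL.
        spark.sql(
          """SELECT customer_id, SUM(daily_total) AS total
            |FROM analytics.daily_order_summary
            |GROUP BY customer_id
            |ORDER BY total DESC
            |LIMIT 10""".stripMargin).show()

        spark.stop()
      }
    }

A job like this would typically be packaged with Maven or SBT, launched via spark-submit, and scheduled with a workflow engine such as Oozie, tying together several of the tools named in the requirements.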

    Location: Mumbai
