Big Data Admin/Data Services
HSBC Group
Hyderabad, Telangana, India, Asia Pacific

Designation: Consultant Specialist, Big Data
Location: HSBC, Lakshmi Cyber City, Hyderabad
Experience: 5 to 10 years total (minimum 3 years relevant experience in Big Data technologies)

  • Good experience in administering Big Data platforms and ecosystem tools, including platform software from Hortonworks, Cloudera, and MapR
  • Experience working in secured environments using technologies such as Kerberos, Knox, Ranger, KMS, HDFS encryption zones, and server SSL certificates (see the Kerberos sketch after this list)
  • Prior experience in Linux system administration is important.
  • Working knowledge of Hortonworks DataFlow (HDF) architecture, setup, and ongoing administration
  • Good knowledge of Hive as a service, HBase, Kafka, and Spark
  • Knowledge of basic data pipeline tools such as Sqoop, file ingestion, and DistCp, and their optimal usage patterns under enterprise scheduling such as Control-M (an ingestion sketch follows this list)
  • Good experience with Hadoop capacity planning for the HDFS file system and YARN resources (a worked sizing example follows this list)
  • Good stakeholder management skills: able to engage in formal and casual conversations and drive the right decisions
  • Good troubleshooting skills: able to identify the specific service causing an issue, review logs to isolate problem entries, and recommend solutions working with the product vendor (a log-scan sketch follows this list)
  • Capable of reviewing and accepting or challenging solutions provided by product vendors for platform optimization and root cause analysis tasks
  • Experience performing product upgrades of the core big data platform, expanding clusters, and setting up high availability for core services
  • Knowledge of the various file formats and compression techniques used within HDFS, and the ability to recommend the right patterns based on application use cases
  • Exposure to Amazon Web Services (AWS) and Google Cloud Platform (GCP) services relevant to the big data landscape, their usage patterns, and their administration
  • Working with application teams to enable their access to the clusters with the right level of access control and logging, using Active Directory (AD) and big data tools
  • Setting up disaster recovery solutions for clusters using platform native tools and custom code depending on the requirements
  • Configuring Java heap and related parameters to ensure all Hadoop services run optimally (a heap-sizing sketch follows this list)
  • Significant experience with Linux shell scripting and with Python or Perl scripting
  • Experience with industry-standard version control tools (Git, GitHub, Subversion) and automated deployment and testing tools (Ansible, Jenkins, Bamboo, etc.)
  • Has worked on projects with Agile/DevOps as the product management framework; good understanding of the principles and the ability to work as part of POD teams
  • Working knowledge of open-source RDBMSs: MySQL, Postgres, MariaDB
  • Ability to go under the hood of Hadoop services (Ambari, Ranger, etc.) that use a database as their backing store
  • Identify project issues, communicate them and assist in their resolution
  • Assist in continuous improvement efforts in enhancing project team methodology and performance
  • Excellent communication, interpersonal, and decision-making skills
  • Availability to work the shift pattern required for this role, for example not shift work but out-of-hours on-call work
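
As an illustration of the secured-environment work above, here is a minimal sketch of authenticating a service account with Kerberos and creating an HDFS encryption zone backed by a KMS key. The principal, keytab path, and key name are hypothetical placeholders, not HSBC values.

    # Authenticate as a hypothetical service principal using its keytab
    kinit -kt /etc/security/keytabs/svc-bigdata.keytab svc-bigdata@EXAMPLE.COM

    # Create a KMS-managed key, then an HDFS encryption zone that uses it
    hadoop key create demo-key
    hdfs dfs -mkdir -p /secure/landing
    hdfs crypto -createZone -keyName demo-key -path /secure/landing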
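
A minimal sketch of the ingestion tools named above. The JDBC URL, credentials file, table, and paths are hypothetical; in practice each command would be wrapped in a script triggered by an enterprise scheduler such as Control-M.

    # Pull a relational table into HDFS with Sqoop (connection details are made up)
    sqoop import \
      --connect jdbc:mysql://db.example.com/sales \
      --username etl_user --password-file /user/etl/.db.pwd \
      --table orders \
      --target-dir /data/raw/orders \
      --num-mappers 4

    # Copy a dataset between clusters with DistCp, e.g. for DR replication
    hadoop distcp -update -p hdfs://prod-nn:8020/data/raw/orders hdfs://dr-nn:8020/data/raw/orders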
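
A worked example of the sizing arithmetic behind HDFS capacity planning; the figures are illustrative assumptions only.

    # To store 100 TB of data with 3x replication while keeping ~25% free space:
    #   100 TB x 3 / 0.75 = 400 TB of raw disk across the DataNodes
    # Compare against what the cluster actually reports:
    hdfs dfsadmin -report | head -n 12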
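
A minimal shell sketch of the log-review style of troubleshooting described above; the service log path is a hypothetical example and differs by distribution.

    #!/usr/bin/env bash
    # Surface the most recent ERROR/FATAL entries from a service log
    LOG=/var/log/hadoop/hdfs/hadoop-hdfs-namenode.log
    grep -E 'ERROR|FATAL' "$LOG" | tail -n 20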
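
A sketch of the heap tuning referred to above, using the widely cited rule of thumb of roughly 1 GB of NameNode heap per million HDFS blocks. The block count and heap values are assumptions, and the variable name varies by Hadoop version (HADOOP_NAMENODE_OPTS on Hadoop 2.x).

    # In hadoop-env.sh: size the NameNode heap for an assumed ~8M blocks (~8 GB plus headroom)
    export HADOOP_NAMENODE_OPTS="-Xms10g -Xmx10g ${HADOOP_NAMENODE_OPTS}"

    # Watch GC behaviour of the running NameNode to confirm the sizing holds
    jstat -gcutil $(pgrep -f NameNode) 5000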
