Big Data Developer - Hadoop/Spark/Kafka (4-10 yrs) Anywhere in India/Multiple Locations/Bangalore/Hyderabad/Pune (Analytics & Data Science)
Tigi HR Solution

Experience: 4+ years

Work Location: Hyderabad / Pune / Bangalore / Anywhere in India

Job Type: Contractual (6+ months)

Job Description

  • 2+ years of experience developing large-scale Big Data applications
  • Hands-on experience with technologies such as Hadoop, Spark, Kafka, Python, GCP, BigQuery, Data Fusion, and Airflow
  • Exposure to Azure Cloud Platform
  • Strong problem-solving skills and a solid grasp of data structures and algorithms
  • Experience with distributed systems handling large amounts of data
  • Knowledge of performance and application testing, as well as scheduling tools
  • General understanding of databases (RDBMS, NoSQL)
  • Ability to work effectively under pressure in a dynamic environment
  • Strong verbal and written communication skills
  • Experience working in an Agile/Scrum model
  • Experience with testing practices, processes and artifact creation, and user acceptance testing
  • Experience testing, automating, and instrumenting your code
  • A degree in computer science or equivalent experience