Softility - Data Delivery Lead - Data Warehousing & Visualization - SQL/Spark/Big Data (8-15 yrs) Hyderabad (Analytics & Data Science)
Softility
Hyderabad, India
4d ago
Source: hirist.com

Job Requirements:

  • Bachelor's or Master's degree in Computer Science.
  • Minimum of 8 years of data experience in any combination of SQL, data warehousing, big data, streaming, etc.
  • Minimum of 4 years of experience with distributed data processing frameworks such as Apache Spark, Amazon EMR, and Hadoop.
  • Solid hands-on working experience in at least four of the following technologies:

    c) EMR

    d) Elasticsearch Stack

    k) AWS Kinesis

  • Demonstrable expertise with Python, Spark, and wrangling of various data formats: Parquet, CSV, XML, JSON (a minimal PySpark sketch follows this list).
  • Experience with batch and stream processing.
  • Experience with building large-scale data processing systems.
  • Solid understanding of data design patterns and best practices.
  • Working knowledge of data visualization tools such as Tableau, Power BI, Apache Superset, or Looker is a plus.
  • Experience in analyzing data to identify deliverables, gaps, and inconsistencies in data sets.
  • Experience running production data pipelines, processing frameworks and data architectures.
  • Strong knowledge of ELT/ETL processes and data engineering experience.
  • Knowledge of scripting or programming (Bash, Python) is a plus.
  • Knowledge of a configuration management tool, such as Ansible.
  • Deep, hands-on experience with Linux and system administration.
  • Comes from a data background with hands-on experience - for example, someone who spent the first few years of their career in data warehousing but has shifted to big data technologies in the last 5-6 years.
  • Very good communication skills and industry knowledge - communicates clearly over the phone and email, articulates ideas well, and stays in touch with the latest trends and changes in the data landscape.
  • Demonstrated expertise in leading and managing data projects - distributed computing, data lakes, search, NoSQL, cloud-based data management, streaming, buffering, etc.
  • Has led a team of at least 5-10 engineers in dealing with moderate to complex data architectures and streaming use cases.
  • Exhibits ongoing learning - for example, a clear progression from data warehousing to Hadoop/HDFS to Spark to AWS is a good indication.
  • Experience grooming freshers/trainees/team members and designing and running training plans is a plus.
  • Has typically worked at data-centric solution companies or within the data practices of large companies.
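
As an illustration of the Python/Spark data wrangling and batch processing skills described above, here is a minimal PySpark sketch; the Spark session setup, S3 paths, and column names are illustrative placeholders, not details taken from this listing.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("format-wrangling-sketch").getOrCreate()

    # Read the same logical dataset from two formats (paths are placeholders).
    orders_csv = (spark.read.option("header", True)
                  .option("inferSchema", True)
                  .csv("s3://example-bucket/orders/csv/"))
    orders_json = spark.read.json("s3://example-bucket/orders/json/")

    # Normalize both sources to a common schema and combine them.
    orders = (orders_csv.select("order_id", "customer_id", "amount")
              .unionByName(orders_json.select("order_id", "customer_id", "amount"))
              .withColumn("amount", F.col("amount").cast("double")))

    # Aggregate and persist as Parquet for downstream batch consumers.
    totals = orders.groupBy("customer_id").agg(F.sum("amount").alias("total_amount"))
    totals.write.mode("overwrite").parquet("s3://example-bucket/orders/parquet/by_customer/")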
Responsibilities:

  • Proactively communicate to team the latest technological developments affecting our industry and identify opportunities for industry presentations.
  • Provide technical leadership to teammates through coaching and mentorship.
  • Maintain high standards of software quality within the team by establishing good practices and habits.
  • Perform reviews of solution design and code.
  • Design, develop and unit test applications.
  • Interest and enthusiasm for learning new technologies and applying them to solve challenging problems.
  • Comfortable with agile software development practices, continuous integration, and test-driven development.
  • Interest in leading, mentoring, and supporting other developers both by example and through your subject matter expertise.
  • Perform technical aspects of specified deliverables and lead a team of technical staff in creating deliverables consistent with scope, schedule, and budget.
  • Other duties as assigned.