WoodFlake - Python/Machine Learning Developer - MapReduce/MySQL (1-2 yrs) MP (Systems/Product Software)
WoodFlake Business Solutions and Retail Pvt. Ltd.
Madhya Pradesh, IN
source : hirist.com

Key Skills : NodeJS, Python, Python web frameworks, Web3.js, Ethereum, Hyperledger, Blockchain Algorithms, Software Architecture, Machine Learning Models, Data Structures, MySQL, NoSQL, PostgreSQL

Scope Of Responsibility :

  • Responsible for architecture and design of models / projects.
  • Acts as the primary technical authority on Big Data analytics, responsible for the overall robustness of architecture and design across functional and non-functional parameters, including performance, security, usability, and scalability.
  • Collaborates in planning initiatives related to Strategy, Future Roadmaps, System Architecture, Development, and Operations.
  • Understands products, technologies, and their applications, and is able to resolve technical and module-related problems.
  • Requires minimal to no direction and supervision.
  • Able to represent the product team externally and / or before other external technology bodies with authority and confidence.
  • Suggests improvements, identifies problems, and adapts existing methods and techniques, drawing on past experience and feedback.

Primary Skills :

  • Expert in Python, with knowledge of at least one Python web framework such as Django or Flask.
  • Knowledge of Django REST Framework, Celery, Redis, and Swagger is a huge plus.
  • Analyses competitive technologies and makes appropriate architectural / design decisions.
  • Develops proofs of concept and prototypes to help illustrate approaches to technology and business problems.
  • Creates architecture and technical design documents to communicate solutions that will be implemented by the development team.
  • Building real-time data pipelines using Kafka / Spark / Storm / Flume.
  • Solid understanding of MapReduce / Pig / Hive / Spark.
  • Experience using languages such as Python / R / Scala / Java for data analytics applications.
  • Building consumption frameworks on Hadoop (RESTful services, self-service BI and analytics).
  • Experience with PaaS platforms such as Cloud Foundry.
  • Extensive, hands-on experience with machine learning and regression models (see the sketch after this list).
  • Familiarity with event-driven programming in Python and with some ORM (Object Relational Mapper) libraries.
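
As a rough, hypothetical illustration of the kind of work the primary skills above describe, the sketch below serves a small scikit-learn regression model from a Flask endpoint; the app name, route, and payload shape are assumptions for illustration, not part of the role description.

    # Minimal sketch, assuming Flask and scikit-learn: a toy regression model
    # exposed via a JSON endpoint. Names and payload shape are illustrative only.
    import numpy as np
    from flask import Flask, jsonify, request
    from sklearn.linear_model import LinearRegression

    app = Flask(__name__)

    # Fit a toy model at startup; a real service would load a persisted model instead.
    X_train = np.array([[1.0], [2.0], [3.0], [4.0]])
    y_train = np.array([2.1, 3.9, 6.2, 8.0])
    model = LinearRegression().fit(X_train, y_train)

    @app.route("/predict", methods=["POST"])
    def predict():
        # Expects a JSON body such as {"features": [2.5]}.
        payload = request.get_json(force=True)
        features = np.array(payload["features"], dtype=float).reshape(1, -1)
        return jsonify({"prediction": float(model.predict(features)[0])})

    if __name__ == "__main__":
        app.run(debug=True)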

Secondary Skills (Desirable) :

  • Data integration (batch, micro-batches, real-time data streaming) across Hadoop, RDBMSs, NoSQL stores (MongoDB / HBase / Cassandra), and data warehouses hosted on premise as well as on cloud platforms (AWS / Azure), at enterprise scale, with at least one end-to-end implementation (see the streaming sketch after this list).
  • Experience optimizing web applications (CDN, code compression).
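
As a hedged sketch of the real-time data integration item above, the snippet below uses kafka-python and pymongo to consume JSON events from a Kafka topic and land them in MongoDB; the topic, broker address, and database names are illustrative assumptions.

    # Minimal sketch, assuming kafka-python and pymongo: consume JSON events
    # from a Kafka topic and write each one into a MongoDB collection.
    # Topic, broker, and database names are illustrative placeholders.
    import json

    from kafka import KafkaConsumer
    from pymongo import MongoClient

    consumer = KafkaConsumer(
        "events",                                  # hypothetical topic name
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )
    events = MongoClient("mongodb://localhost:27017")["analytics"]["raw_events"]

    for message in consumer:
        # Each consumed record is inserted as-is; a real pipeline would add
        # validation, batching, and error handling.
        events.insert_one(message.value)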

Big Plus (these will improve your chances of getting hired) :

  • Passion for developing test automation tools (see the sketch after this list).
  • Self-starter, independent, and able to come up with solutions to complex problems.
  • Experience improving, inventing, or proving the value of new algorithms that enhance a product's capabilities, speed, efficiency, and reliability.
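
As a small, hypothetical example of the test-automation interest mentioned above, a pytest-style check might look like the following; the function under test is invented purely for illustration.

    # Minimal sketch of test automation with pytest. The function under test
    # (normalize_scores) is a made-up example, not code from the posting.
    import pytest

    def normalize_scores(scores):
        """Scale a list of numbers linearly into the 0-1 range."""
        low, high = min(scores), max(scores)
        if high == low:
            return [0.0 for _ in scores]
        return [(s - low) / (high - low) for s in scores]

    def test_normalize_scores_bounds():
        # 3 maps to 0.0, 7 to 0.5, 11 to 1.0.
        assert normalize_scores([3, 7, 11]) == pytest.approx([0.0, 0.5, 1.0])

    def test_normalize_scores_constant_input():
        assert normalize_scores([5, 5, 5]) == [0.0, 0.0, 0.0]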

Base Requirements (must have to be considered) :

  • Full-time Bachelor's and / or Master's degree in engineering (B.E. / B.Tech / M.E. / M.Tech).
  • And the right attitude.
  • Core field experience required : 1 to 2 years.
