3-9 years of overall experience
Technical solutions experience including 3+ years in a combination of relevant Big Data / Analytics areas, such as Hadoop and other industry Big Data frameworks, and the underlying infrastructure for Big Data solutions (clustered / distributed computing, storage, data center networking)
Implemented at least 2 Big Data projects from start through roll-out
Experience deploying a large distributed Big Data application
Experience designing, building, and operating cloud services at an IT organization, system integrator, or service provider
Extensive experience with Java / Python / Scala
Able to demonstrate micro- and macro-level design skills; familiar with Unix commands, with basic working experience in Unix shell scripting
Good exposure to and experience with cluster administration and preparing an enterprise Big Data landscape from scratch
Expertise in solutioning with Hadoop, Hive, Spark / PySpark, SQL, and Oozie
AWS Cloud experience is a plus
Professional Skills:
Solid written, verbal, and presentation skills
Strong team player who also works well individually
Maintains composure in all types of situations and is collaborative by nature
High standards of professionalism, consistently producing high-quality results
Self-sufficient and independent, requiring very little supervision or intervention
Demonstrates flexibility and openness in bringing creative solutions to issues