BE / B.Tech / MCA / MS-IT / CS / BCA, or any other degree in a related field
AWS cloud: minimum 4 years of experience
Hands-on experience with AWS services such as S3, Lambda, EMR, DynamoDB, SNS, Data Pipeline, and Step Functions (optional)
Experience building large-scale data pipelines using S3, EMR, DynamoDB, Lambda, and Spark
Experience working with the Hadoop ecosystem and a good understanding of its core concepts
Strong technical knowledge of Spark and experience building Spark pipelines in Python / Scala using RDDs, Spark SQL, or DataFrames
Understanding of data lake and data warehouse concepts
Understanding of various data architecture approaches and the ability to develop end-to-end data solutions