Basic Function :
The Senior Big Data Developer works closely with the Big Data Architect to translate the customer's business requirements into a Big Data solution.
This includes understanding the customer's data requirements; designing the application and interfaces; and developing, testing, and deploying the proposed solution.
Has the ability to design large-scale data processing systems and to identify best practices in design and development.
The Senior Big Data Developer also understands the complexity of data and can design systems and models to handle a variety of data with varying levels of volume, velocity, and veracity.
Should have independently worked on design and data ingestion concepts in a consultative role.
Essential Functions :
Has a deep understanding and experience with several of the following : Business Analysis, Requirements Gathering, Data Analysis, Data Modeling
Advanced knowledge of design and development methodologies.
Demonstrated work ethic, focus and self-discipline
Has and maintains a deep understanding of the role of big data in business and the enterprise.
Propose recommended and / or best practices regarding the movement, manipulation, and storage of data in a big data solution, including data ingestion, data storage options, query techniques, and data variety, volume, and velocity.
Research and experiment with emerging technologies and tools related to big data.
Experience in scaling applications on big data platforms to massive size.
Long-term development and technical expertise in the DW / BI practice; communicates well with all stakeholders, optimizes objectives, leverages state-of-the-art tools and best practices, integrates with corporate systems, and delivers on time.
Deep understanding of Data Warehousing, Dimensional Modeling, Star & Snowflake schema design, Reference DW Architectures, ETL Architecture, ETL (Extract / Transform / Load), Data Analysis, Data Conversion / Transformation, Database Design, Data Warehouse Optimization, Data Mart Development, and Enterprise Data Warehouse Maintenance and Support.
Primary Internal Interactions :
Project Delivery :
Leads big data development teams in documenting and implementing technical solutions for business problems.
Works with the Big Data Architect to estimate big data design and development work.
Works with the Big Data Architect to ensure timely delivery of artifacts.
Account & Resource Management - NA
Primary External Interactions :
Assessment Participation & Leadership - NA
Must Have Skills :
Hadoop stack including HDFS cluster, MapReduce, Hive, Spark and Impala
Web Technologies : CSS, DHTML, XML, Highcharts, Linux
ETL tools such as Informatica, Talend and / or Pentaho.
Query : SQL, NoSQL concepts
Ingest : Kafka, Sqoop, Flume
Orchestration : Zookeeper
Databases : Postgres, MongoDB, Cassandra, HBase
Languages : Java, Scala
Good to have Skills :
Core : AWS, Hadoop, YARN
Process : Agile-Scrum, Iterative Development, DevOps, CI
Analytics : Descriptive, Predictive (Added advantage)
Tools : Jenkins and TFS
Languages : Python, Java Enterprise
(ref : hirist.com)