Solution Architect - ADAS
Mercedes-Benz
Bangalore
2d ago

Tasks

Responsibilities

  • Work directly with stakeholders to define, drive, manage and implement enterprise data strategy and roadmaps that align with business objectives.

  • Architect and build solutions around Data Quality assurance and Data preparation.
  • Provide best practices for data handling, including the use of multiple data access zones across the Data Lake (raw / landing, operational, self-service, data marts, etc.), and understand how to apply data governance in each.

  • Develop highly scalable and extensible Big Data platforms to ingest, store, model, assure quality standards, and analyze massive data sets from numerous channels and in varying formats.
  • Monitor, manage and tune MapR cluster job performance, capacity planning, and security.
  • Ensure proper configuration management and change controls are implemented during code creation and deployment.
  • This position requires solid attention to detail, deep technical expertise, superb communication and exceptional follow-through.
Qualifications

  • Hands-on experience in the architecture and implementation of an enterprise Data Lake using a major Hadoop distribution: Cloudera, Hortonworks, MapR, etc.

  • Hands-on prior experience using underlying big data platforms, including HDFS, HBase, Hive, MapR-FS, etc.
  • Hands-on experience with Big Data technologies (batch & real time) in the MapR ecosystem (such as Spark, Pig, Hive, Flume, Oozie, Avro, YARN, Kafka, Scala, Storm and Apache NiFi)
  • In-depth knowledge of data warehousing, ETL / ELT, and data modeling best practices
  • Deep understanding of cloud and hybrid-cloud computing infrastructure and platforms.
  • Hands-on experience in containerization using Docker.
  • Deep understanding of building and running Kubernetes and containerized applications on the cluster.
  • Knowledge of or experience with NoSQL databases, such as:
      - Key-value store (Cassandra, Couchbase, Redis, S3)
      - Document store (CouchDB)
      - Columnar (HBase, Cassandra)
  • Strong analytical and reasoning skills that result in clear, robust, flexible architectures
  • 5-10 years’ experience in the big data field