Enterprise Data Architect
Incedo Inc

Job Description:

  • We have a strong preference for someone with Amazon Web Services (AWS) experience.
  • Strengthen the Data engineering team with Big Data solutions running on AWS / Azure / GCP
  • Designing and planning ETL processes
  • Lead discussions of the latest trends in Big Data technology and how they apply to cloud, on-premises, and hybrid deployments
  • Ability to write quality technical documentation.
  • Experience in data warehouse design and best practices.
  • Establish DevOps processes for marshalling big data work products from development to production
  • Feature development that ties into customer needs and product capabilities
  • Analyzing and recommending new product capabilities on cloud environment
  • Strong organizational skills, with the ability to work autonomously as well as in a team-based environment.
  • Collaborate with business, technology, and architecture teams across the company to design and develop our next-generation data platforms using traditional DB technologies, open-source frameworks, and cloud services.
  • Exposure to various ETL and Business Intelligence tools
  • Must be very strong in writing SQL queries
Primary Skills:

  • Master's degree in Computer Science or equivalent
  • Twelve+ years of experience in delivering technology solutions
  • Three+ years of experience in leading design of Big Data solutions
  • Five+ years of experience with Hadoop, including HDFS, Spark, Hive, HBase
  • Five+ years of experience with NoSQL technologies such as MongoDB, Elasticsearch, Solr
  • Five+ years of working experience with Unix systems
  • Expertise with data lakes, data warehouses, and data marts
  • Experience with Data Governance, Master Data Management
  • Experience with AWS services like EMR, Kinesis, S3, CloudFormation, Data Pipeline, Glue
  • Data warehouse experience with Apache Kylin, Apache NiFi, Apache Airflow, and Kylo
  • Three+ years of experience designing and implementing streaming and batch solutions on Hadoop
  • Experience supporting Data Scientists with machine learning code in Spark
  • Ability to translate product objectives into an execution and delivery plan with milestones and resource needs
  • Strong communication and presentation skills
  • Experience with git and other source control systems
  • Solid grounding in Agile methodologies
Secondary Skills:

  • Certification in Hadoop / Big Data from Hortonworks / Cloudera / MapR
  • A strong background in delivering high-value, business-facing technical projects in major organizations
  • Experience with IT automation tools like Ansible, Chef, and Puppet
  • Experience with Hortonworks Data Platform (HDP) / Cloudera / MapR
  • Strong client relationship management skills to identify and close suitable business development opportunities
  • Exceptional interpersonal skills - including presentation skills
  • Experience of managing client delivery teams, ideally coming from a Data Engineering / Data Science environment.