Data Engineer
Codalyze Technologies
Mumbai, India
3d ago
source : Cutshort

Job Overview : Your mission is to help lead your team toward creating solutions that improve the way our business runs.

Your knowledge of design, development, coding, testing, and application programming will help your team raise their game, meeting your standards and satisfying both business and functional requirements.

Your expertise in various technology domains will be counted on to set strategic direction and solve complex and mission-critical problems, internally and externally.

Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.

Responsibilities and Duties :
  • As a Data Engineer, you will be responsible for developing data pipelines for numerous applications, handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially Spark and Hive, is highly preferred.
  • Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data / analytical solutions.
  • Responsible for the design and development of integration solutions with Hadoop / HDFS, real-time systems, data warehouses, and analytics solutions.

Education level :
  • Bachelor's degree in Computer Science or equivalent.

Experience :
  • Minimum 5+ years of relevant experience on production-grade projects, with hands-on, end-to-end software development.
  • Expertise in application, data, and infrastructure architecture disciplines.
  • Expertise in designing data integrations using ETL and other data integration patterns.
  • Advanced knowledge of architecture, design, and business processes.

Proficiency in :
  • Modern programming languages like Java, Python, Scala.
  • Big Data technologies: Hadoop, Spark, Hive, Kafka.
  • Writing well-optimized SQL queries.
  • Orchestration and deployment tools like Airflow & Jenkins for CI / CD (optional).
  • Knowledge of system development lifecycle methodologies, such as waterfall and Agile.

  • An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
  • Experience generating physical data models and the associated DDL from logical data models.
  • Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
  • Experience enforcing data modeling standards and procedures.
  • Knowledge of web technologies, application programming languages, OLTP / OLAP technologies, data strategy disciplines, relational databases, data warehouse development, and Big Data solutions.
  • Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals.

Skills (Must Know) :
  • Core big-data concepts
  • Spark - PySpark / Scala
  • A data integration tool like Pentaho, NiFi, SSIS, etc. (at least one)
  • Handling of various file formats
  • Cloud platform - AWS / Azure / GCP
  • Orchestration tool - Airflow

Skills : Hadoop, Scala, Spark, Amazon Web Services (AWS), Java, Python, Apache Hive, and Big Data