Lead Data Engineer
Acko General Insurance
HSR Layout, Bengaluru, Karnataka, India
3d ago

Job Description

Acko is India's first and only all-digital Insurtech product company. Through innovative digital products, customised pricing and the use of data and tech, we are changing how insurance works and how it is perceived by users in India.

Although we are solving for the Indian market, we are part of a global wave of Insurtech startups creating success through technology and business-model disruption: ZhongAn in China ($11 Bn valuation), Oscar ($3 Bn valuation), Lemonade and Metromile in the US are some of the others rejigging this space.

We are a well-funded Series D company backed by a slate of marquee investors including Binny Bansal, Amazon, Ascent Capital, Accel, SAIF and Catamaran.

While FY21 will only be our third year of operations, on the back of a steep growth trajectory over the past two years we are expecting strong financial growth.

We clocked roughly $20M in premiums (revenue) in our first year of operations and ended FY20 with over $50M in premiums, a growth rate of 150%.

Through partnerships with large internet players such as Amazon, Ola, RedBus, Oyo, Lendingkart, ZestMoney and the GOMMT group, our micro-insurance product has reached 50M unique users.

Our commitment to building a diverse team, innovative products and a collaborative culture has earned us many accolades and awards.

We are 'Great Place to Work' certified and have consistently featured on LinkedIn's list of top startups. A team of 450 and counting, we are growing at an unstoppable pace and would love for you to be a part of this incredible journey.

Role :

As a Data Engineer at Acko, you will work on collecting, storing, processing and analysing huge datasets. The data may come from heterogeneous sources and needs to be collected either in batch or in real time.

The primary focus will be on choosing optimal, highly scalable solutions for these purposes, then implementing, maintaining and monitoring them.

You will be part of a team building the next-generation data warehouse platform, implementing new technologies and practices in existing data pipelines and extending or migrating to new architecture as needed.

You will be responsible for supporting the rapidly growing and dynamic business demand for data and making it available for business decisions. This data will mostly be consumed by the Data Analytics and Data Science teams, giving your work an immediate influence on day-to-day decision making at Acko.
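The "batch or in real time" collection mentioned above is often bridged by micro-batching. As a loose, hypothetical sketch in plain Python (not a description of Acko's actual stack), a collector might emit a batch either when it fills up or when a time budget expires:

```python
import time

def micro_batches(events, batch_size=3, max_wait_s=1.0):
    # Emit a batch when it is full or when the wait budget expires;
    # this bridges streaming input and batch-oriented downstream jobs.
    batch, deadline = [], time.monotonic() + max_wait_s
    for event in events:
        batch.append(event)
        if len(batch) >= batch_size or time.monotonic() >= deadline:
            yield batch
            batch, deadline = [], time.monotonic() + max_wait_s
    if batch:
        yield batch  # flush the final partial batch

print(list(micro_batches(range(7), batch_size=3)))
# → [[0, 1, 2], [3, 4, 5], [6]]
```

In production this role is played by frameworks such as Apache Beam or Kafka consumers; the sketch only shows the size-or-timeout trade-off between latency and throughput.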

Responsibilities

  • Deliver on the business data requirements
  • Investigate, identify and establish new tools and processes for data warehousing, data quality assurance, reporting, business intelligence, data governance and data cataloging.
  • Setup reliable data ingestion pipelines for new data sources and integrate them effectively with existing data sets
  • Assemble, aggregate and transform large datasets to meet the requirements of downstream data consumers
  • Build data quality tracking mechanisms and diligently execute issue-resolution processes
  • Address inevitable disruptions in data ingestion and processing
  • Develop a comprehensive data catalog
  • Develop robust internal documentation
  • Implement data governance strategies as required by the business
  • Address questions and concerns from downstream data consumers
  • Continue to adapt to changes based on the emergence of new technologies, new competitors, artificial intelligence and alternative sources of data
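To make the data-quality-tracking responsibility concrete, here is a minimal, hypothetical sketch in plain Python (the field names and checks are invented for illustration, not Acko's): it counts missing required fields and exact duplicate rows in a batch, the kind of signals such a mechanism might alert on.

```python
from collections import Counter

def data_quality_report(records, required_fields):
    """Track basic data-quality metrics for a batch of ingested records:
    per-field null counts and the number of exact duplicate rows."""
    null_counts = Counter()
    seen = Counter()
    for rec in records:
        for field in required_fields:
            if rec.get(field) in (None, ""):
                null_counts[field] += 1
        # A duplicate is a record whose full key/value content repeats.
        seen[tuple(sorted(rec.items()))] += 1
    duplicates = sum(n - 1 for n in seen.values() if n > 1)
    return {"nulls": dict(null_counts), "duplicates": duplicates}

# Example batch from a hypothetical policy-events source:
batch = [
    {"policy_id": "P1", "premium": 1200},
    {"policy_id": "P2", "premium": None},   # missing premium
    {"policy_id": "P1", "premium": 1200},   # duplicate row
]
print(data_quality_report(batch, ["policy_id", "premium"]))
# → {'nulls': {'premium': 1}, 'duplicates': 1}
```

A real pipeline would persist these metrics per run and alert when they cross a threshold, rather than printing them.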
Minimum Qualifications

  • 5+ years of experience in data engineering and data warehousing with strong working knowledge of multiple SQL and Big Data technologies.
  • Proficient in at least one programming language (Python, Scala, Java, etc.)
  • Strong data warehouse and data modelling skills, with solid knowledge of industry standards such as dimensional modelling and star schemas
  • Hands-on experience with Big Data tools, and with designing and developing data pipelines on Google Cloud Platform
  • Experience building cloud-native data platforms for batch and streaming workloads, preferably on GCP
  • Hands-on experience with Dataflow, Apache Beam, Spark, Hadoop, Kafka, BigQuery, Data Catalog, Airflow etc.
  • Knowledge of reporting tools like Tableau or other BI packages.
  • Excellent written and verbal communication skills, empathy, initiative and ownership
  • Superb analytical skills, technical aptitude, influencing skills and attention to detail
  • Experience in working with Analytics, Data Science or Reporting teams
  • Eager to learn new things and passionate about technology
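The dimensional-modelling and star-schema skills listed above can be illustrated with a toy example; the table and column names are invented for illustration, not Acko's actual model. A central fact table holds measures and foreign keys, and queries resolve those keys through small dimension tables:

```python
# Toy star schema: one fact table keyed into two dimension tables.
dim_product = {1: {"product": "car"}, 2: {"product": "health"}}
dim_date = {20240101: {"month": "2024-01"}, 20240215: {"month": "2024-02"}}

fact_premiums = [
    {"product_key": 1, "date_key": 20240101, "premium": 100.0},
    {"product_key": 1, "date_key": 20240215, "premium": 150.0},
    {"product_key": 2, "date_key": 20240101, "premium": 200.0},
]

def premiums_by(dimension_attr):
    """Aggregate the fact table by an attribute resolved via a dimension."""
    totals = {}
    for row in fact_premiums:
        if dimension_attr == "product":
            key = dim_product[row["product_key"]]["product"]
        else:  # any other value aggregates by month through the date dimension
            key = dim_date[row["date_key"]]["month"]
        totals[key] = totals.get(key, 0.0) + row["premium"]
    return totals

print(premiums_by("product"))
# → {'car': 250.0, 'health': 200.0}
```

In a warehouse such as BigQuery the same shape becomes a fact table joined to dimension tables in SQL; the point of the star schema is that every analytical slice is just a join plus a GROUP BY.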
Preferred Qualifications

  • Bachelor's or Master's degree in Computer Science, Engineering, Statistics, Informatics, Information Systems or another quantitative field
  • Strong technical skills in best-in-class technologies for building data platforms, data lakes, data warehouses, data wrangling, data quality and data governance on Google Cloud Platform
  • Ability to work effectively with a varied set of stakeholders.
  • Experience with Agile Development Methodologies and Test-Driven Development with strong focus on processes.
Acko is an equal opportunity employer. We welcome and encourage diversity in the workplace regardless of race, gender, religion, age, sexual orientation, gender identity, disability or veteran status.

Department: Tech | Open Positions: 1 | Skills Required: Dataflow, Apache Beam, Spark, BigQuery
