BigData Engineer
Luxoft
Bangalore, IN
4d ago

Project Description

Luxoft is looking for experienced and enthusiastic Big Data engineers to join our growing team in India. You will have the opportunity to work with our global Banking and Capital Markets clientele on change and transformation projects.

You will be part of high-caliber project teams comprising engineers with deep technical expertise and domain experience.

Luxoft offers a competitive compensation and benefits package for motivated and deserving candidates. The insurance benefits of Luxoft India are among the best in the industry.

As you progress through your career with Luxoft India, you will also have the opportunity to apply for roles in Luxoft's overseas locations through our flagship Internal Mobility (IM) program.

Responsibilities

1. Conceptualize and implement data pipelines based on business requirements

2. Fine-tune existing Spark / Hive programs

3. Write test cases and enable CI / CD

4. Identify improvement opportunities and advise the client on enhancing existing programs

5. Create and maintain data models and data dictionaries

6. Write APIs to expose or consume data where required

7. Orchestrate data pipelines and integrate ML / AI models into them

Must have

3-6 years of relevant experience

Basic knowledge of the Banking and Financial Services domain

Tools & Technologies

a. Apache Spark or PySpark, HDFS / Hadoop

b. Languages: Python / Java / Scala, plus Hive scripting

c. NoSQL databases: Cassandra / MongoDB / HBase / others; SQL databases: Oracle / PostgreSQL / others; MPP databases: Teradata / Greenplum / others; in-memory databases: GemFire / Redis / others

d. Messaging frameworks: Apache Kafka / RabbitMQ / Pulsar / others

e. Workflow orchestration: Oozie / Airflow / Crontab / Control-M

f. Data ingestion: Apache Flume / Sqoop / NiFi

g. Bash scripting

Mandatory experience in one or more of the following:

a. Experience implementing Big Data projects; has worked on at least two such projects

b. Developing reliable, autonomous, and scalable data pipelines

c. Experience with, or knowledge of, building a data quality framework

d. Extensive experience building data pipelines that stream data from various data sources to a data lake or data mart

e. Good understanding of data warehouse concepts

f. Experience building data APIs that allow downstream consumers to access insights

g. Experience working with the Cloudera stack

h. Experience working in agile environments

Nice to have

1. Knowledge of, or experience with, AWS / Azure

2. Knowledge of, or experience with, visualization tools (Tableau / Qlik Sense / Power BI)

3. Knowledge of, or experience with, ML / AI

4. Experience with microservice architecture

Languages

English : C1 Advanced

