The Purpose of This Role
At Fidelity, we use data and analytics to personalize customer experiences and develop solutions that help our customers live the lives they want.
As part of our digital transformation, we are making significant investments in innovative big data capabilities and platforms.
One such investment is building enterprise data lakes that gather data across Business Units. We are looking for a hands-on data engineer who can help us design and develop our next-generation, cloud-enabled data capabilities.
The Value You Deliver
You will be participating in end-to-end development, which includes design, development, testing, and deployment.
You will be working closely with the Technical Lead and Architects to ensure that solutions are consistent with the IT Roadmap.
You will be participating in technical life cycle processes, which include impact analysis, design review, code review, and peer testing.
You will be participating in hands-on development of application framework code in Oracle PL/SQL, PySpark, Python, NiFi, and Informatica PowerCenter, along with Control-M and UNIX shell scripts.
You will be troubleshooting and fixing reported data and performance issues.
You will be presenting findings and outcomes to Senior Leadership teams and providing insights from the data to the business.
You will be helping business teams optimize their current tasks and increase their productivity.
The Skills that are Key to this role
Technical / Behavioral
You must be an expert in SQL and PL/SQL on Oracle or Netezza, with UNIX shell scripting skills.
You should have working knowledge of Hadoop, HDFS, Hive, Spark, and NoSQL databases.
You should have experience using AWS services such as RDS, EC2, S3, EMR, and IAM to move data onto a cloud platform.
Experience with or knowledge of Kubernetes, containerization, and building applications in containers.
Knowledge of logging, telemetry, and data security on AWS / Azure.
Understanding of data modeling and Continuous Integration tools (e.g., Jenkins, Git, Concourse).
Experience with query tuning and optimization in an RDBMS (Oracle or DB2).
You should have experience with Control-M or similar scheduling tools.
You should have proven analytical and problem-solving skills.
You should have a strong grasp of database and data warehousing concepts.
You must be able to work independently in a globally distributed environment.
You should have a clear understanding of business needs and be able to incorporate them into technical solutions.
The Skills that are good to have for this role
Experience in performance tuning and optimization techniques in SQL (Oracle and Netezza) and Informatica PowerCenter.
Strong interpersonal and communication skills, including written, verbal, and technical illustration.
Working knowledge of DevOps, JIRA, and Agile practices.
How Your Work Impacts the Organization
Cloud enablement and an analytics-ready data model.
The Expertise we’re looking for
3+ years (Software Engineer) / 7+ years (Lead) of experience in Data Warehousing, Big Data, Analytics, and Machine Learning
Graduate / Post Graduate
Location: Bangalore, Chennai
Shift timings: 11:00 am - 8:00 pm
Fidelity will reasonably accommodate applicants with disabilities who need adjustments to participate in the application or interview process.
To initiate a request for an accommodation, please contact the following:
For roles based in the US: Contact the HR Leave of Absence / Accommodation Team by sending an email to , or by calling , prompt 2, option 2
For roles based in Ireland : Contact
For roles based in Germany : Contact