Senior Data Platform Engineer
Iron Mountain
Bangalore, IND
1d ago

At Iron Mountain we protect what our customers value most, from the everyday to the extraordinary, while helping them bridge the physical and digital world.

Our people have the opportunity to bring their creativity to a workplace that thrives on change. Here, you will be part of a team that doesn’t just embrace what’s exceptional; it creates exceptional.


As Iron Mountain continues its digital transformation, we are growing our Enterprise Data Platform team, which directly supports all business intelligence, analytics, and data integration solutions at Iron Mountain.

This team is responsible for the design, development, and implementation of the data platform components used to deliver our data solutions.

Our Senior Data Platform Engineer needs advanced knowledge of cloud big data technologies, software development experience, and strong SQL skills.

The ideal candidate has a background in software development and big data engineering and is comfortable working remotely while supporting hybrid on-shore/off-shore engineering teams.


Build and operationalize cloud-based platform components

Build production-quality ingestion pipelines with automated quality checks, enabling the business to access all of our data sets in one place

Assess the systems architecture currently in place and work with technical staff to recommend improvements

Build automation, using Python modules, to support our product development and data analytics initiatives

Achieve maximum uptime of our platform using cloud technologies such as Kubernetes, Terraform, and Docker

Resolve technical problems as they arise

Provide guidance to development teams

Continually research current and emerging technologies and propose changes where needed

Assess the business impact of technical choices

Participate in a collaborative, peer-review-based environment, fostering new ideas via cross-team guilds and specialty groups

Maintain comprehensive documentation of our processes and decision making


Experience with DevOps/automation tools to help minimize operational overhead for our platform

Must be able to contribute to self-organizing teams with minimal supervision, working within the Agile/Scrum methodology

Bachelor’s degree in Computer Science or a related field

3+ years of related information technology experience

1+ years of strong experience building complex ETL pipelines with dependency management, utilizing file watchers, APIs, etc. (e.g., Airflow, Talend)

2+ years of strong experience directly related to Big Data technologies

Spark, Hive, Hadoop, Parquet, HDFS, Python, Scala, data lakes, NoSQL

Industry-recognized certifications (similar to the below):

GCP Certified Data Engineer

GCP Certified Solutions Architect

AWS Solutions Architect

Spark Certified Developer

Demonstrated experience with the Scrum Agile methodology

Deep familiarity with PaaS services, containers, and orchestration, specifically Docker and Kubernetes

Strong ability to learn new technologies quickly

Well-developed verbal and written communication skills


Be part of an ever-evolving global organization focused on transformation and innovation

A support system where you have a safe place to voice your opinion, share feedback, and be your true authentic self

Global connectivity to learn from 26,000+ teammates across 52 countries

Be part of a winning team that embraces diversity, inclusion, and our differences

Competitive Total Reward offerings to support your career at Iron Mountain, family, personal wellness, financial wellbeing, and retirement
