Senior Data Engineer - Digital and Developer Platform (DDP)
VISA
Bengaluru, KA, IN

Company Description

As the world's leader in digital payments technology, Visa's mission is to connect the world through the most creative, reliable and secure payment network - enabling individuals, businesses, and economies to thrive.

Our advanced global processing network, VisaNet, provides secure and reliable payments around the world, and is capable of handling more than 65,000 transaction messages a second.

The company's dedication to innovation drives the rapid growth of connected commerce on any device, and fuels the dream of a cashless future for everyone, everywhere.

As the world moves from analog to digital, Visa is applying our brand, products, people, network and scale to reshape the future of commerce.

At Visa, your individuality fits right in. Working here gives you an opportunity to impact the world, invest in your career growth, and be part of an inclusive and diverse workplace.

We are a global team of disruptors, trailblazers, innovators and risk-takers who are helping drive economic growth in even the most remote parts of the world, creatively moving the industry forward, and doing meaningful work that brings financial literacy and digital commerce to millions of unbanked and underserved consumers.

You're an Individual. We're the team for you. Together, let's transform the way the world pays.

Job Description and Responsibilities

At Digital and Developer Platform, we are focused on creating smarter payment solutions that provide fast, easy ways to pay with your phone, the web, and other digital devices wherever you are, whether the payment is remote or in proximity.

The Digital and Developer Platforms Engineering team is looking for talented engineers for our Bangalore office to drive the evolution of data, analytics, and reporting solutions by developing a scalable platform that supports application development teams.

You will work closely with architects, product owners, and other teams to conceive, design, and create unique platform capabilities.

Your code will be rock solid and built to perform across multiple platforms and environments. You will also work closely with various stakeholders to help establish web development standards.

You will drive the development of multiple applications, prototype new applications and feature ideas, and explore technologies at the forefront of web development.

We are looking for someone with in-depth working knowledge of Apache Hadoop, Hive, Apache Nifi, Kafka, Spark, Python, and Apache Airflow or similar scheduling technologies, and an interest in distributed computing, scalability, and building reliable, high-performance systems. The candidate is expected to have strong Big Data design and implementation experience, including best practices.

KEY RESPONSIBILITIES

  • Design and implement platform components and services to be consumed by the business and application teams for their data, analytics, and reporting requirements.

  • Interface with various stakeholders and establish rock-solid platform capabilities in addition to delivering business requirements.

  • Work as part of a scrum team executing product requirements in collaboration with the PMO, architects, product owners, and other product vertical stakeholders.

  • Aspire to be a subject matter expert in all platform capabilities and adapt quickly to change.
  • Develop and adapt to common industry best practices for Big Data development.

  • Take part in proof-of-concept (POC) efforts for visionary initiatives
  • Work independently and mentor new college graduates and interns
  • Participate in peer reviews for new feature implementations
  • Communicate status frequently in the daily Scrum
  • Participate in cross-group and internal customer feature demos

Qualifications

  • Bachelor's degree in Computer Science or another technical field
  • 5+ years of overall software development experience
  • 2+ years of professional experience developing highly scalable platforms

  • Expertise in Big Data technologies such as Apache Hadoop, Hive, Sqoop/Nifi, Java/Scala, Python, Kafka, and Spark, scheduling technologies such as Apache Airflow, and other relevant distributed technologies.
  • Proficiency in SQL and HiveQL, and in UNIX Bash and shell scripting.
  • Strong computer science fundamentals in data structures, algorithms, and complexity analysis

  • Passion for learning architectural and design patterns
  • Knowledge of security standards
  • Experience with tools such as Rally, JIRA, SharePoint, wikis, etc.
  • Experience with Agile development methodology
  • Sound problem-solving skills
  • Ability and desire to learn new skills and take on new initiatives
  • Excellent verbal and written communication skills
  • Self-motivated

Additional Information
