Senior DevOps Engineer, Data Engineering
PepsiCo, Inc.
Hyderabad, India
6d ago

Job Description

PepsiCo operates in an environment undergoing immense and rapid change. Big-data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences and IoT.

The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo’s global business scale to enable business insights, advanced analytics and new product development.

PepsiCo’s Data Management and Operations team is tasked with the responsibility of developing quality data collection processes, maintaining the integrity of our data foundations and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation.

What PepsiCo Data Management and Operations does:

  • Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company
  • Own day-to-day data collection, transportation, maintenance/curation of, and access to the PepsiCo corporate data asset
  • Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science or other stakeholders
  • Increase awareness about available data and democratize access to it across the company
  • As a member of the data engineering team, you will be the key technical expert developing and overseeing PepsiCo's data product build and operations, and driving a strong vision for how data engineering can proactively create a positive impact on the business.

    You'll be an empowered member of a team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company.

    As a member of the data engineering team, you will help lead the development of very large and complex data applications into public cloud environments directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics.

    You will work closely with process owners, product owners and business users. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems and help grow DevOps and DataOps culture.

    Key Accountabilities:

  • Active contributor to code development in projects and services.
  • Collaborate with a cross-functional team of application developers, operations engineers, and architects to understand complex product requirements and translate them into automated solutions that you build.
  • Collaborate with colleagues to support and improve architecture, systems, processes, standards and tools.
  • Lead technical discussions to ensure solutions are designed for successful deployment, security, and high availability in the cloud
  • Design, implement, and maintain server, storage, network, and security infrastructure as code.
  • Build reusable pipelines for application deployments.
  • Write and maintain code for automating the creation of scalable, resilient systems and infrastructure, with a focus on immutability and containers.
  • Develop, implement, and test automated data backup and recovery, and disaster recovery procedures across multiple regions.
  • Write and maintain clear, concise documentation, runbooks and operational standards including infrastructure diagrams.
  • Assist development teams in the creation and understanding of automated application configurations.
  • Ensure all solutions are properly monitored and instrumented.
  • Troubleshoot and resolve complex issues in development, test and production environments.
  • Design and deploy scalable, highly available, and fault tolerant distributed systems.
  • Continuously identify, adopt, & refine best practices.
    Qualifications / Requirements:

  • Bachelor’s Degree in Cyber Security, Information Technology, Computer Science, or a related field, or equivalent practical experience.
  • 6+ years of experience in software and/or infrastructure, ideally including 3+ years in cloud, Kubernetes, automation development, and/or orchestration roles.
  • 2+ years of hands-on experience on Azure, leveraging a number of PaaS services offered by the platform.
  • Excellent problem-solving and analytical skills to effectively address the needs of customers, including experience handling problem escalations and notifications.
  • Experience working in GCP, AWS, PCF, Azure, or other cloud-based technologies.
  • Experience with Terraform, Ansible, Salt, or similar automation tools is a benefit as we drive toward Infrastructure as Code (IaC).
  • Experience with SCM and DevOps tool suites; examples include Git, Sonar, Jenkins, Artifactory, HashiCorp Packer, etc.
  • Experience with containers, Docker, Kubernetes, and serverless functions.
  • Experience with Linux (RHEL/CentOS) and Windows system administration.
  • Programming/scripting background with knowledge of Python, PowerShell, and Groovy.
  • Hands-on experience with Azure services (proficiency with Azure DevOps, ARM Templates, Azure Policy, Azure CLI, and the Azure REST API).
  • Experience provisioning, operating, monitoring, troubleshooting, and maintaining systems running in the cloud.
  • Multi-year experience in application development and configuration automation.
  • Understanding of application, server, and network security.
  • Understanding of immutable infrastructure and infrastructure as code concepts.
  • Working knowledge of Agile/Scrum, and experience leading continuous integration and continuous delivery concepts and frameworks.
  • Experience with firewall and load-balancing technology; Palo Alto, F5, or Citrix NetScaler is a plus.
  • Cloud certifications (Azure Solutions Architect, DevOps Engineer, or other cloud professional certifications) are a plus.
    Skills, Abilities, Knowledge:
  • Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior level management.
  • Proven track record of leading and mentoring data teams.
  • Strong change manager: comfortable with change, especially change that arises through company growth, and able to lead a team effectively through times of change.
  • Adept at learning and applying new technologies and solving new problems.
  • Ability to understand and translate business requirements into data and technical requirements.
  • High degree of organization and ability to manage multiple, competing projects and priorities simultaneously.
  • Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment.
  • Strong leadership, organizational and interpersonal skills; comfortable managing trade-offs.
  • Foster a team culture of accountability, communication, and self-management.
  • Proactively drives impact and engagement while bringing others along.
  • Consistently attain or exceed individual and team goals.
  • Ability to lead others without direct authority in a matrixed environment.
    Relocation Eligible: Not Eligible for Relocation

    Job Type: Pipeline
