India is among the top ten priority markets for General Mills and hosts our Global Shared Services Centre, the Global Shared Services arm of General Mills, Inc., which supports its operations worldwide. With over 1,300 employees in Mumbai, the center has capabilities in Supply Chain, Finance, HR, Digital and Technology, Sales Capabilities, Consumer Insights, ITQ (R&D and Quality), and Enterprise Business Services.
Learning and capacity-building are key ingredients of our success.

Job Overview

We are looking for a data consultant responsible for development projects, enhancements, and support for the Trade product team.
The Trade product team develops and maintains trade applications and provides data solutions that integrate and transform business data into GCP / Data Lake, delivering the data layer for Sales using cutting-edge technologies such as GCP and Big Data (Hadoop).
You will be responsible for developing and leading custom GCP / data-lake solutions for advanced business intelligence and data mining.
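As illustration only (not part of the role description), the "integrate and transform business data into a data layer" responsibility can be sketched as a minimal extract-transform-load step. The table and column names here (raw_orders, sales_layer) are hypothetical, and SQLite stands in for the actual GCP / Hadoop storage:

```python
import sqlite3

# Hypothetical sketch: extract raw trade records, transform them,
# and load a small aggregated "data layer" table for Sales.
# SQLite stands in for the real GCP / Hadoop stack here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("APAC", 120.0), ("APAC", 80.0), ("EMEA", 200.0)],
)

# Transform + load: aggregate order amounts per region.
conn.execute(
    """
    CREATE TABLE sales_layer AS
    SELECT region, SUM(amount) AS total_amount
    FROM raw_orders
    GROUP BY region
    """
)

rows = dict(conn.execute("SELECT region, total_amount FROM sales_layer"))
print(rows)  # {'APAC': 200.0, 'EMEA': 200.0}
```

In a production GCP setting the same shape of work would typically run as BigQuery SQL or Spark jobs rather than SQLite, but the extract / transform / load structure is the same.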
Job Responsibilities

Core Responsibilities (70% of time):
- Design, create, code, and support a variety of GCP, Hadoop, ETL, and SQL solutions
- Apply agile techniques and methods
- Learn the Trade system, developing knowledge of system data and business rules / logic
- Create and support scheduled jobs in Tidal; maintain Oracle packages, procedures, and triggers
- Work effectively in a distributed global team environment
- Communicate effectively on technical and business matters, with good influencing skills
- Analyze existing processes and user development requirements to ensure maximum efficiency
- Manage multiple stakeholders and tasks, and navigate through ambiguity and complexity
- Turn information into insight by consulting with architects, solution managers, and analysts to understand business needs and deliver solutions
- Maintain strong technical skills and share knowledge with team members

20% of time:
- Support existing data assets and related jobs
- Job-scheduling experience (Tidal, Airflow, Linux)
- Lead small projects / initiatives and contribute to, or lead, the implementation of projects

10% of time:
- Proactively research up-to-date technologies and techniques for development
- Bring an automation mindset and a continuous-improvement mentality to streamline processes and eliminate waste
- Train and educate the internal team, IT functions, and business users
- Familiarity with real-time and streaming data processes

Desired Profile

Education:
- Minimum degree requirement: Bachelor's
- Preferred degree requirement: Master's
- Preferred major area of study: Engineering

Experience:
- Minimum Hadoop experience required: 5-8 years
- Preferred Data Lake / Data Warehouse experience: 7-12+ years

Specific Job Experience or Skills Needed (skill: level):
- HDFS, MapReduce: Expert
- Hive, Impala & Kudu: Expert
- Python: Intermediate
- SQL, PL/SQL: Expert
- Data Warehousing Concepts: Expert
- Big Data Modelling: Intermediate
- Scala: Intermediate
- ETL Tools: Expert
- Data Governance tool (any): Beginner
- Enterprise Hadoop Architecture: Beginner
- Cloud - GCP: Beginner

Other Competencies:
- Begins identifying as the technical go-to person
- Proactively identifies potential issues, deadline slippage, and opportunities in projects and tasks, and takes timely decisions
- Demonstrates strong attention to detail and delivery accuracy
- Self-motivated team player, able to overcome challenges and achieve desired results

COMPANY OVERVIEW

We exist to make food the world loves.
But we do more than that. Our company is a place that prioritizes being a force for good, a place to expand learning, explore new perspectives and reimagine new possibilities, every day.
We look for people who want to bring their best: bold thinkers with big hearts who challenge one another and grow together.
Because becoming the undisputed leader in food means surrounding ourselves with people who are hungry for what’s next.