Data Services LMTS
Hyderabad, India

Job Details

Our Tech and Product team is responsible for innovating and maintaining a massive distributed systems engineering platform that ships hundreds of features to production for tens of millions of users across all industries every day.

Our users count on our platform to be highly reliable, lightning fast, supremely secure, and to preserve all of their customizations and integrations every time we ship.

Our platform is deeply customizable to meet the differing demands of our vast user base, creating an exciting environment filled with complex challenges for our hundreds of agile engineering teams every day.

Check out our "We are Salesforce Engineering" video.

Team: The mission of the Unified Intelligence Platform is to make using data at scale easy and empowering, by marrying cutting-edge technology with an exceptional user experience.

Massive amounts of data are generated at Salesforce each day. Crafting a platform that enables internal users to understand, optimize, and re-envision how the business operates, and how it should, is critical.

Modern distributed systems (e.g. Spark and Presto), coupled with AI tech and custom data applications (e.g. a comprehensive data catalog and a data portal) are a few of the key ingredients this team is weaving together to make this possible.

We are defining the next generation of trusted enterprise computing in the cloud. We're a fast-paced, agile and innovative team.

We're highly collaborative and work across all areas of our technology stack. We enable critical services for the business, qualify complex compute changes, enable big data analytics and trail-blaze new engineering solutions for the cloud.

Who are we looking for?

Software engineers and full-stack engineers with a passion for building data products at enterprise scale.

Should be willing to play multiple roles as an engineer (a jack of all trades: data engineer, backend engineer, UI engineer, DevOps engineer, or support engineer, depending on the need).

Have a beginner's mind, but also embrace existing solutions on their merits. Have a strong background in software engineering; programming language is not a barrier.

Have the ability to communicate openly and respectfully. Have an open mind, a zeal for continuous learning, and a passion for continuous self-improvement.

Embrace Salesforce Ohana culture and values.

Job Responsibilities:

Architect, design, build, and take ownership of business-critical data and platform services.

Own and drive self-service, metadata-driven data pipelines, services, and applications that ingest data from different sources into our multi-cloud, petabyte-scale big data platform.

Work with the product management and customer teams to understand requirements, and build generic solutions and products that set them up for success.

Build and architect solutions to simplify data ingestion, data processing, data quality and data discovery with security and data governance baked into the solution from the ground up.

Champion the service ownership model and build solutions with ample telemetry and control planes to simplify governance and operations of all the services.

Architect, re-design, and re-imagine on-prem data pipelines as hybrid cloud solutions, and drive migration activities.

Build data frameworks that simplify common data tasks, ease integrations, enforce best-practice patterns, introduce consistency, and make migrating to different tools easy.

Build data quality services that can be intelligently woven into the platform and make it easy for data analysts, engineers, and stewards to keep tabs on data quality.

Build Salesforce apps and integration mechanisms with the data ecosystem to help monitor and manage the end-to-end data lifecycle from a single place.

Work with engineers on the design, deployment, and continuous improvement of important infrastructure services.

Design and implement processes to streamline CI/CD systems for platform services on the cloud.

Deploy, manage, and maintain tools within the UIP tech stack, such as Airflow, Spark, and Presto.

Work with third-party software professional services to troubleshoot and fix issues related to the tools that are part of the platform.

Work on building and orchestrating data pipelines and platform services for a multi-cloud (AWS, GCP, and first-party) hybrid data lake.

Build multi-tenant, multi-cloud data management solutions. Proactively work with teams across Salesforce to create new data sources and products.

Responsible for enabling data governance for the end-to-end data platform.

Job Requirements:

10+ years of experience with large-scale data delivery platforms, designing solutions with modern data systems to support exponential data growth.

4+ years of experience in product/application development using Java or Scala. Solid understanding of CS fundamentals and system design, and experience building large-scale, data-intensive applications and solutions.

Experience managing and driving large-scale projects and products involving multiple stakeholders. Strong coding and design competencies (knowledge of design patterns and best practices).

Strong understanding of and experience with distributed computing frameworks, particularly Apache Hadoop (YARN, MapReduce, HDFS) and Apache Spark. Knowledge of data modeling techniques and high-volume ETL/ELT design.

Strong SQL optimization and performance tuning experience in a high-volume data environment that utilizes parallel processing.

Proven hands-on experience with big data technologies such as Hadoop, Kafka, MapReduce, Spark, Presto, Hive, Splunk, and Airflow.

Experience with programming languages like Java and Scala, and scripting in Python, Perl, or Bash. Experience working with public cloud platforms like GCP, AWS, or Azure.

Fluency in one or more scripting languages such as Python, shell script, or Go. Experience designing, building, and deploying systems with containers (Docker) and Kubernetes.

In-depth, hands-on experience with Linux, networking, server, and cloud architectures. Experience with deployment tools like Terraform and Spinnaker, and configuration management tools such as Puppet or Ansible.

Experience with AWS, GCP, or another cloud PaaS provider. Solid understanding of how to configure, deploy, manage, and maintain large cloud-hosted systems, including auto-scaling, monitoring, performance tuning, troubleshooting, and disaster recovery.

Proficiency with source control (Git, GitHub), continuous integration, and testing pipelines.

Being a great listener, collaborator, communicator, and mentor. Championing a culture and work environment that promotes diversity and inclusion.

Hands-on Salesforce.com knowledge of the product and functionality is a plus.

Salesforce, the Customer Success Platform and world's #1 CRM, empowers companies to connect with their customers in a whole new way.

We are the fastest growing of the top 10 enterprise software companies, the World's Most Innovative Company according to Forbes, and one of Fortune's 100 Best Companies to Work For six years running.

The growth, innovation, and Aloha spirit of Salesforce are driven by our incredible employees, who thrive on delivering success for our customers while also finding time to give back through our 1-1-1 model, which leverages 1% of our time, equity, and product to improve communities around the world.

Salesforce is a team sport, and we play to win. Join us!

Accessibility: If you require accessibility assistance applying for open positions, please contact the Recruiting Department.
