Python with Any Cloud (AWS, Azure, GCP)
I Square Soft
Chennai
5d ago

Roles and Responsibilities

Python with any cloud (AWS, Azure, GCP) for Bangalore / Chennai / Delhi locations.

Please find the JD below.

Mandatory skills: Python, SQL, any cloud (AWS, Azure, GCP), data modelling
Experience: 4–6 years
Notice period: candidates currently serving notice, or able to join within 15 days

Location: Bangalore, Chennai, Delhi

For Tredence Analytics (Tredence.com)

Must Have (Required Skills):

  • 2–3 years of experience designing and developing Python programs for data curation and processing; experience with object-oriented programming in Python
  • Knowledge of AWS storage, compute, and serverless services, particularly S3, Lambda, Kinesis, SQS, and Glue
  • Expertise in at least two of these database technologies: relational, MPP, and distributed databases hosted in the cloud or on-premise
  • 4–6 years' overall experience in IT, professional services IT delivery, or large-scale IT analytics projects
  • Experience connecting and integrating with at least one of the following platforms: Google Cloud, Microsoft Azure, Amazon AWS
  • Experience with Redshift, including data modelling, data ingestion, integration, processing, and provisioning
  • Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, and provide best practices for data pipeline operations
  • Able to work in a rapidly changing business environment and adapt quickly to its fast pace
  • Advanced SQL writing and experience in data exploration, using databases with complex datasets in a business environment
  • Strong verbal and written communication skills, and the ability to work effectively across internal and external organizations

Good to Have (Preferred Skills):

  • Experience programming in any of these languages: Java, PySpark, or Spark
  • Exposure to Apache Airflow
  • Exposure to open-source or commercial ETL tools such as Talend, Informatica, or DataStage
  • Familiarity with data quality and standardization, including reference data management
  • Experience with catalog, lineage, and metadata management
  • Exposure to DevOps / CI-CD tools, services, and methodologies
  • Deploy logging and monitoring across the different integration points for critical alerts
  • Experience with different database computing paradigms such as in-memory, distributed, and massively parallel processing
  • Delivered data and analytics projects on any of the cloud platforms (AWS / Azure / GCP)
  • Experience delivering projects in a highly collaborative delivery model with onsite and offshore teams