Data Engineer, AugIntel
Cardinal Health
Bangalore, Karnataka, India
5d ago

Headquartered in Dublin, Ohio, Cardinal Health, Inc. (NYSE: CAH) is a global, integrated healthcare services and products company connecting patients, providers, payers, pharmacists and manufacturers to enable care coordination and better patient management.

Backed by nearly 100 years of experience, with more than 50,000 employees in nearly 60 countries, Cardinal Health ranks among the top 20 on the Fortune 500.

Department Overview:

Augmented Intelligence (AugIntel) builds automation, analytics and artificial intelligence solutions that drive success for Cardinal Health by creating material savings, efficiencies and revenue-growth opportunities.

The team drives business innovation by leveraging emerging technologies and turning them into differentiating business capabilities.

Job Overview:

You will design, build and operationalize large-scale enterprise data solutions and applications using one or more Google Cloud Platform data and analytics services in combination with technologies such as Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions and Airflow.

Responsibilities:

  • Designing and implementing data transformation, ingestion and curation functions on GCP using GCP-native services or custom programming
  • Designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, etc.
  • Optimizing data pipelines for performance and cost in large-scale data lakes

Desired Qualifications:

  • Bachelor's degree or equivalent work experience preferred
  • 5+ years of architecture and engineering experience in Big Data systems, Data Analytics and Data Integration
  • 1+ years of hands-on GCP experience in Data Engineering and Cloud Analytics, including operationalizing enterprise-scale solutions
  • Hands-on experience architecting and designing data lakes on GCP serving analytics and BI application integrations
  • Hands-on experience with data ingestion technologies such as Cloud Dataflow, Cloud Data Fusion and Airflow
  • 3+ years of experience writing complex SQL queries, stored procedures, etc.
  • Experience introducing and operationalizing self-service data preparation tools (e.g. Alteryx, Cloud Dataprep, Trifacta, Paxata) on GCP
  • Experience architecting and implementing metadata management on GCP
  • Experience architecting and implementing data governance and security for data platforms on GCP
  • Agile development skills and experience
  • Experience with CI/CD pipelines such as Concourse and Jenkins
  • Google Cloud Platform certification is a plus