
Cardinal Health

Data Engineer, Data Engineering

Reposted 16 Days Ago
IND
Senior level

Data Engineer

Headquartered in Dublin, Ohio, Cardinal Health, Inc. (NYSE: CAH) is a global, integrated healthcare services and products company connecting patients, providers, payers, pharmacists and manufacturers to support care coordination and better patient management. Backed by nearly 100 years of experience, with more than 50,000 employees in nearly 60 countries, Cardinal Health ranks among the top 20 on the Fortune 500.


Department Overview: 

AAA (Advanced Analytics & Automation) builds automation, analytics and artificial intelligence solutions that drive success for Cardinal Health by creating material savings, efficiencies and revenue growth opportunities. The team drives business innovation by leveraging emerging technologies and turning them into differentiating business capabilities. 


Job Overview: 

Design, build and operationalize large-scale enterprise data solutions and applications using one or more Google Cloud Platform data and analytics services, in combination with technologies such as Cloud Dataflow, BigQuery, Cloud Pub/Sub, Cloud Functions and Airflow.

Responsibilities: 

  • Design and implement data transformation, ingestion and curation functions on GCP using native GCP services or custom programming.
  • Design and build production data pipelines from ingestion to consumption within a hybrid big data architecture, using SQL, Python and orchestration tools.
  • Optimize data pipelines for performance and cost in large-scale data lakes.
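The ingestion → transformation → curation flow described in the responsibilities can be pictured with a minimal, standard-library Python sketch. The function names, record fields and curation rules below are illustrative assumptions for the sake of the example, not part of the role or any Cardinal Health system:

```python
# Conceptual sketch of an ingest -> transform -> curate pipeline.
# All names and rules here are hypothetical illustrations.

def ingest(raw_rows):
    """Parse raw CSV-style rows into string-valued records."""
    return [dict(zip(("id", "amount"), row.split(","))) for row in raw_rows]

def transform(records):
    """Cast fields to their proper types."""
    return [{"id": int(r["id"]), "amount": float(r["amount"])} for r in records]

def curate(records):
    """Deduplicate by id and drop non-positive amounts."""
    seen, out = set(), []
    for r in records:
        if r["id"] not in seen and r["amount"] > 0:
            seen.add(r["id"])
            out.append(r)
    return out

raw = ["1,10.5", "2,-3.0", "1,10.5", "3,7.25"]
curated = curate(transform(ingest(raw)))
print(curated)  # records 2 (negative amount) and the duplicate of 1 are dropped
```

In a production GCP setting, the same three stages would typically map onto managed services (for example, Cloud Dataflow for transformation and BigQuery for curated storage) with Airflow coordinating the steps.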


Desired Qualifications:

  • Bachelor's degree preferred, or equivalent work experience.
  • 5+ years of engineering experience in Big Data systems, data analytics and data integration.
  • 3+ years of experience writing complex SQL queries, stored procedures, etc.
  • 1+ years of hands-on GCP experience in data engineering and cloud analytics solutions.
  • Understanding of data modeling and dimensional modeling concepts.
  • Hands-on experience with data ingestion technologies such as Cloud Dataflow, orchestration tools such as Airflow, and scripting languages such as Python or shell scripting.
  • Experience designing and optimizing data models on GCP using data stores such as BigQuery.
  • Agile development skills and experience.
  • Experience with CI/CD pipelines such as Concourse or Jenkins.
  • Google Cloud Platform certification is a plus.
  • Hands-on experience with data visualization tools such as Looker or Tableau is a plus.

Candidates who are returning to work, people with disabilities, candidates without a college degree, and veterans are encouraged to apply.

Cardinal Health supports an inclusive workplace that values diversity of thought, experience and background. We celebrate the power of our differences to create better solutions for our customers by ensuring employees can be their authentic selves each day. Cardinal Health is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, ancestry, age, physical or mental disability, sex, sexual orientation, gender identity/expression, pregnancy, veteran status, marital status, creed, status with regard to public assistance, genetic status or any other status protected by federal, state or local law.


Top Skills

Airflow
CI/CD
BigQuery
Cloud Dataflow
Cloud Functions
Cloud Pub/Sub
Google Cloud Platform
Looker
Python
SQL
Tableau

