Sysco

Engineer - Data Engineering

Remote
Hiring Remotely in Sri Lanka
Junior
JOB DESCRIPTION

Engineer - Data Engineering 

 

About Sysco LABS:  

Sysco LABS is the Global In-House Center of Sysco Corporation (NYSE: SYY), the world’s largest foodservice company. Sysco ranks 56th on the Fortune 500 list and is the global leader in the trillion-dollar foodservice industry.

Sysco employs over 75,000 associates and operates 337 smart distribution facilities worldwide, along with a fleet of over 14,000 IoT-enabled trucks serving 730,000 customer locations. For fiscal year 2025, which ended June 29, 2025, the company generated sales of more than $81.4 billion.

   

Sysco LABS Sri Lanka delivers the technology that powers Sysco’s end-to-end operations.

     

Sysco LABS’ enterprise technology is present across the end-to-end foodservice journey, enabling the sourcing of food products, merchandising, storage and warehouse operations, order placement and pricing algorithms, the delivery of food and supplies to Sysco’s global network, and the in-restaurant dining experience of the end customer.

 

The Opportunity 

 

We are currently on the lookout for a Data Engineer to join our team.  

 

Responsibilities: 

  • Designing and developing data solutions for one of the world’s largest corporations involved in the marketing and distribution of food products 

  • Implementing distributed and highly available data processing applications that scale to enterprise demands on cloud services such as GCP 

  • Designing, developing, and maintaining data pipelines that integrate multiple source systems and targets 

  • Adhering to Continuous Integration and Continuous Delivery (CI/CD) practices when delivering solutions 

  • Ensuring high code quality by following software engineering best practices 

  • Working collaboratively in a cross-functional team in an Agile delivery environment 

  • Adhering to DevOps principles and being involved in projects throughout their full software lifecycle: from development, QA, and deployment to post-production support 

 

Requirements: 

  • A Bachelor’s Degree in Computer Science or equivalent, and 1-2 years of experience in developing enterprise-grade data processing applications 

  • A strong programming background in data ops (Python, Shell, SQL) 

  • Experience in processing large volumes of data 

  • Hands-on experience working with relational/NoSQL databases and distributed storage engines 

  • Hands-on experience in ETL/ELT design and development using tools such as (but not limited to) Airflow; Google Cloud services such as Cloud Composer, Cloud Dataflow, and Cloud Dataproc; AWS services such as Data Pipeline, Glue, Lambda, and EMR; and frameworks such as Spark and Hive 

  • Experience in working with streaming data (using tools such as Pub/Sub, Kafka, Storm, Spark) will be an added advantage 

  • Hands-on experience with Google Cloud Platform (GCP): services and serverless functions (e.g., BigQuery, Datastream, Dataflow, Pub/Sub, Cloud Functions, Cloud Run, Cloud Composer) for data processing and orchestration 

  • Experience working in a Scrum/Agile delivery environment and with DevOps practices 

  • Experience with code management and CI/CD tools such as GitHub, GitLab, and Jenkins 

  • Experience working in an Agile environment and aligning pod members on the technical vision and the path to implementation 

  • A strong desire to continue to grow your skillset 

  • Strong communication skills, with the ability to influence and convince 

  • Experience with application monitoring (Datadog, or equivalent) will be an added advantage 

  • A passion for building and maintaining data solutions (data warehousing, data marts, data lakes, data mesh) will be an added advantage 

 

Benefits:  

  • US dollar-linked compensation   

  • Performance-based annual bonus   

  • Performance rewards and recognition   

  • Agile Benefits - special allowances for Health, Wellness & Academic purposes   

  • Paid birthday leave  

  • Team engagement allowance   

  • Comprehensive Health & Life Insurance Cover - extendable to parents and in-laws   

  • Overseas travel opportunities and exposure to client environments   

  • Hybrid work arrangement   

  

  

Sysco LABS is an Equal Opportunity Employer. 

Top Skills

Airflow
Amazon Web Services
BigQuery
Cloud Functions
Cloud Run
Dataflow
Datastream
Git
Gitlab
Google Cloud Services
Jenkins
Pub/Sub
Python
Shell
SQL
