
A.P. Moller - Maersk

Sr. Data Engineer

Posted 15 Days Ago
In-Office
2 Locations
Senior level

Senior Data Engineer
 

What We’re Building – and Why You Should Care

GDA is Maersk’s bold leap into the future of data and AI. It’s not just a platform; it’s a transformation of how the world’s largest integrated logistics company turns its operational data into strategic intelligence. Think: real-time insights on vessel ETAs and carbon emissions, metadata-driven supply chain automation, and retrieval-augmented copilots that advise planners and operators. Our data engineers don’t just build pipelines; they shape the very foundation that powers AI-native logistics.

As a data engineer in the GDA-FBM team, you’ll help modernize and operationalize Maersk’s global data estate. You’ll craft reusable, observable, and intelligent pipelines that enable ML, GenAI, and domain-specific data products across a multi-cloud environment. Your code won’t just move data; it’ll move trade.
 

What You'll Be Doing

  • Ingest the world: Design and maintain ingestion frameworks for high-volume structured and unstructured data from operational systems, APIs, file drops, and events. Support streaming and batch use cases across latency windows.
  • Transform at scale: Develop transformation logic using SQL, Python, Spark, and modern declarative tools like dbt or SQLMesh. You’ll handle deduplication, windowing, watermarking, late-arriving data, and more (see the streaming sketch after this list).
  • Curate for trust: Collaborate with domain teams to annotate datasets with metadata, ownership, PII classification, and usage lineage. Enforce naming standards, partitioning schemes, and schema evolution policies.
  • Optimize for the lakehouse: Work within a modern lakehouse architecture, leveraging Delta Lake, S3, Glue, and EMR, to ensure scalable performance and queryability across real-time and historical views (a Delta maintenance sketch follows this list).
  • Model for analytics: Bring strong expertise in data modelling and dimensional modelling, including experience with SSAS/AAS.
  • Work across Azure: Bring hands-on experience with the Azure tech stack, such as ADF and ADB.
  • Build for observability: Instrument your pipelines with quality checks, cost visibility, and lineage hooks. Integrate with OpenMetadata, Prometheus, or OpenLineage to ensure platform reliability and traceability (see the quality-check sketch after this list).
  • Enable production-readiness: Support deployment workflows via GitHub Actions, Terraform, and IaC patterns. Your code will be versioned, testable, and safe for multi-tenant deployments.
  • Think platform-first: Everything you build should be reusable. You’ll help codify data engineering standards, create scaffolding for onboarding new datasets, and drive automation over repetition.
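To make the transformation bullet above concrete, here is a minimal PySpark Structured Streaming sketch of watermarking, deduplication, and windowed aggregation. The Kafka topic, broker address, field names, and storage paths are illustrative assumptions only, not details of the GDA platform, and the Delta sink assumes the delta-spark package is available on the cluster.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("eta_stream_sketch").getOrCreate()

    # Ingest: read an event stream (hypothetical topic and broker; requires the
    # spark-sql-kafka connector on the classpath).
    raw = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")   # placeholder
        .option("subscribe", "vessel-events")                # placeholder
        .load()
    )

    # Parse the JSON payload into typed columns (schema is illustrative).
    events = (
        raw.selectExpr("CAST(value AS STRING) AS json")
        .select(
            F.from_json(
                "json",
                "event_id STRING, vessel_id STRING, event_time TIMESTAMP, eta TIMESTAMP",
            ).alias("e")
        )
        .select("e.*")
    )

    # Tolerate late-arriving data with a watermark, then deduplicate replays.
    deduped = (
        events.withWatermark("event_time", "2 hours")
        .dropDuplicates(["event_id", "event_time"])
    )

    # Windowed aggregation: latest ETA per vessel per 15-minute window.
    latest_eta = (
        deduped.groupBy(F.window("event_time", "15 minutes"), "vessel_id")
        .agg(F.max("eta").alias("latest_eta"))
    )

    # Land the result in a Delta table; paths are placeholders.
    query = (
        latest_eta.writeStream.outputMode("append")
        .format("delta")
        .option("checkpointLocation", "/chk/vessel_eta")
        .start("/lake/silver/vessel_eta")
    )
    query.awaitTermination()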
     
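For the lakehouse bullet, a short sketch of how a curated Delta table might be written and then maintained. The S3 bucket, column names, and the OPTIMIZE/ZORDER step (available in Delta Lake 2.x+ with its SQL extensions enabled, and on Databricks) are assumptions for illustration, not a prescription.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("lakehouse_sketch").getOrCreate()

    # Hypothetical raw extract landed as Parquet on S3.
    shipments = spark.read.parquet("s3://example-bucket/raw/shipments/")

    # Write a partitioned Delta table so real-time and historical queries can
    # prune by date instead of scanning the full history.
    (
        shipments.write.format("delta")
        .mode("overwrite")
        .partitionBy("load_date")            # assumes a load_date column
        .option("overwriteSchema", "true")   # controlled schema evolution
        .save("s3://example-bucket/lake/shipments/")
    )

    # Periodic maintenance: compact small files and co-locate rows by a
    # frequently filtered key (container_id is hypothetical).
    spark.sql("""
        OPTIMIZE delta.`s3://example-bucket/lake/shipments/`
        ZORDER BY (container_id)
    """)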

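And for the observability bullet, a hand-rolled quality-check sketch. In practice the resulting metrics would be pushed to whatever the platform standardizes on (for example Prometheus gauges or OpenMetadata test results); the column names and thresholds here are hypothetical.

    from pyspark.sql import DataFrame, functions as F

    def run_quality_checks(df: DataFrame, required_cols: list[str], min_rows: int = 1) -> dict:
        """Compute row-count and null-rate metrics, failing fast on violations."""
        row_count = df.count()
        null_rates = {
            c: df.filter(F.col(c).isNull()).count() / max(row_count, 1)
            for c in required_cols
        }
        if row_count < min_rows:
            raise ValueError(f"Quality check failed: only {row_count} rows")
        offenders = [c for c, rate in null_rates.items() if rate > 0.0]
        if offenders:
            raise ValueError(f"Quality check failed: nulls in required columns {offenders}")
        return {"row_count": row_count, "null_rates": null_rates}

    # Example usage against the (hypothetical) curated shipments table above:
    # metrics = run_quality_checks(
    #     spark.read.format("delta").load("s3://example-bucket/lake/shipments/"),
    #     required_cols=["container_id", "load_date"],
    # )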
What We’re Looking For

  • Strong foundation in data engineering: You know your way around distributed systems, columnar storage formats (Parquet, Avro), data lake performance tuning, and schema evolution.
  • Hands-on cloud experience: You’ve worked with AWS-native services like Glue, EMR, Athena, Lambda, and object storage (S3). Bonus if you’ve used Databricks, Snowflake, or Trino.
  • Modern engineering practices: Familiarity with GitOps, containerized workflows (Docker, K8s), and CI/CD pipelines for data workflows. Experience with Terraform and IaC is highly valued.
  • Programming proficiency: Fluency in Python and SQL is a must. Bonus if you’ve worked with Scala, Jinja-templated SQL, or DSL-based modeling frameworks like dbt/sqlmesh.
  • Curiosity and systems thinking: You understand the tradeoffs between batch vs streaming, structured vs unstructured, and cost vs latency, and you ask why before you build.
  • Collaboration skills: You’ll work closely with ML engineers, platform architects, security teams, and domain data owners. Ability to communicate clearly and write clean, documented code is key.
     

What Makes This Role Special

  • Impact at global scale: Your work will influence container journeys, terminal operations, vessel routing, and sustainability metrics across 130+ countries and $4T+ in global trade.
  • Platform-level thinking: You’re not just solving one use case-you’re building primitives for others to reuse. This is your chance to shape a high-leverage internal data platform.
  • Freedom to experiment: We don’t believe in checkbox engineering. You’ll have space to challenge the status quo, propose better tooling, and refine the foundations of our platform stack.
  • Career-defining scope: Greenfield. Executive visibility. Cross-domain exposure. This is not a maintenance role; it’s about creating the next chapter in Maersk’s data journey.

If you’ve ever wanted to build a platform that moves not just data but global trade, join us.

Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements.

 

We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, to apply for a position, or to perform a job, please contact us by emailing [email protected]

Top Skills

AAS
Athena
AWS
Databricks
dbt
Delta Lake
Docker
EMR
Glue
K8s
Lambda
Python
S3
Snowflake
Spark
SQL
SQLMesh
SSAS
Terraform
Trino
