
A.P. Moller - Maersk

Infrastructure Engineer

Posted 3 Days Ago
Be an Early Applicant
In-Office
2 Locations
Mid level
Design and maintain data ingestion frameworks, optimize lakehouse architectures, and build reusable data pipelines for machine learning across a multi-cloud environment.

This role sits at the heart of Maersk’s bold leap into the future of data and AI. It’s not just a platform; it’s a transformation of how the world’s largest integrated logistics company turns its operational data into strategic intelligence. Think real-time insights on vessel ETAs and carbon emissions, metadata-driven supply chain automation, and retrieval-augmented copilots that advise planners and operators. Our data engineers don’t just build pipelines; they shape the very foundation that powers AI-native logistics. You’ll help modernize and operationalize Maersk’s global data estate, crafting reusable, observable, and intelligent pipelines that enable ML, GenAI, and domain-specific data products across a multi-cloud environment. Your code won’t just move data; it’ll move trade.
What You'll Be Doing

  • Ingest the world: Design and maintain ingestion frameworks for high-volume structured and unstructured data from operational systems, APIs, file drops, and events. Support streaming and batch use cases across latency windows.
  • Transform at scale: Develop transformation logic using SQL, Python, Spark, and modern declarative tools like dbt or sqlmesh. You’ll handle deduplication, windowing, watermarking, late-arriving data, and more (a minimal sketch follows this list).
  • Curate for trust: Collaborate with domain teams to annotate datasets with metadata, ownership, PII classification, and usage lineage. Enforce naming standards, partitioning schemes, and schema evolution policies.
  • Optimize for the lakehouse: Work within a modern lakehouse architecture, leveraging Delta Lake, S3, Glue, and EMR, to ensure scalable performance and queryability across real-time and historical views.
  • Build for observability: Instrument your pipelines with quality checks, cost visibility, and lineage hooks. Integrate with OpenMetadata, Prometheus, or OpenLineage to ensure platform reliability and traceability.
  • Enable production-readiness: Support deployment workflows via GitHub Actions, Terraform, and IaC patterns. Your code will be versioned, testable, and safe for multi-tenant deployments.
  • Think platform-first: Everything you build should be reusable. You’ll help codify data engineering standards, create scaffolding for onboarding new datasets, and drive automation over repetition.
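To give a concrete flavour of the ingestion and transformation work above, here is a minimal, hypothetical PySpark Structured Streaming sketch of watermarking and deduplication for late-arriving events. It is illustrative only: the Kafka source, topic, event schema, and S3/Delta paths are assumptions rather than details from this posting, and running it requires the Spark Kafka connector and Delta Lake packages.

```python
# Hypothetical sketch: ingest vessel events, tolerate late arrivals, and
# deduplicate replays before landing them in a Delta table. All names
# (topic, broker, schema, paths) are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("vessel-events-ingest").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("vessel_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("payload", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # illustrative broker
    .option("subscribe", "vessel-events")               # illustrative topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

deduped = (
    events
    # Keep state long enough to accept events arriving up to 2 hours late.
    .withWatermark("event_time", "2 hours")
    # Drop replays of the same event within the watermark window.
    .dropDuplicates(["event_id", "event_time"])
)

query = (
    deduped.writeStream
    .format("delta")  # lakehouse sink; requires the Delta Lake package
    .option("checkpointLocation", "s3://example-bucket/checkpoints/vessel-events")
    .outputMode("append")
    .start("s3://example-bucket/bronze/vessel_events")
)
query.awaitTermination()
```

In practice the same pattern extends to batch backfills and other latency windows; the watermark duration and deduplication keys are the knobs that decide how much lateness the pipeline tolerates.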

What We’re Looking For

  • Strong foundation in data engineering: You know your way around distributed systems, columnar storage formats (Parquet, Avro), data lake performance tuning, and schema evolution.
  • Hands-on cloud experience: You’ve worked with AWS-native services like Glue, EMR, Athena, Lambda, and object storage (S3). Bonus if you’ve used Databricks, Snowflake, or Trino.
  • Modern engineering practices: Familiarity with GitOps, containerized workflows (Docker, K8s), and CI/CD for data pipelines. Experience with Terraform and IaC is highly valued (a short testing sketch follows this list).
  • Programming proficiency: Fluency in Python and SQL is a must. Bonus if you’ve worked with Scala, Jinja-templated SQL, or DSL-based modeling frameworks like dbt/sqlmesh.
  • Curiosity and systems thinking: You understand the tradeoffs: batch vs. streaming, structured vs. unstructured, cost vs. latency. And you ask why before you build.
  • Collaboration skills: You’ll work closely with ML engineers, platform architects, security teams, and domain data owners. Ability to communicate clearly and write clean, documented code is key.
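As a small illustration of the versioned, testable engineering style described above, here is a hypothetical pure-Python helper with a pytest unit test, the kind of check a CI pipeline (e.g. GitHub Actions) could run on every commit. The function, field names, and scenario are invented for this sketch; they are not an existing Maersk API or dataset.

```python
# Hypothetical sketch: a small transformation helper plus a pytest unit test.
# Names and fields are illustrative only.
from datetime import datetime, timezone


def latest_eta_per_vessel(events: list[dict]) -> dict[str, datetime]:
    """Return the most recently reported ETA for each vessel."""
    latest: dict[str, dict] = {}
    for event in events:
        vessel = event["vessel_id"]
        if vessel not in latest or event["reported_at"] > latest[vessel]["reported_at"]:
            latest[vessel] = event
    return {vessel: e["eta"] for vessel, e in latest.items()}


def test_latest_eta_keeps_most_recent_report():
    events = [
        {"vessel_id": "V1",
         "reported_at": datetime(2024, 5, 1, 8, tzinfo=timezone.utc),
         "eta": datetime(2024, 5, 3, tzinfo=timezone.utc)},
        {"vessel_id": "V1",
         "reported_at": datetime(2024, 5, 1, 12, tzinfo=timezone.utc),
         "eta": datetime(2024, 5, 4, tzinfo=timezone.utc)},
    ]
    assert latest_eta_per_vessel(events) == {
        "V1": datetime(2024, 5, 4, tzinfo=timezone.utc)
    }
```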

What Makes This Role Special

  • Impact at global scale: Your work will influence container journeys, terminal operations, vessel routing, and sustainability metrics across 130+ countries and $4T+ in global trade.
  • Platform-level thinking: You’re not just solving one use case; you’re building primitives for others to reuse. This is your chance to shape a high-leverage internal data platform.
  • Freedom to experiment: We don’t believe in checkbox engineering. You’ll have space to challenge the status quo, propose better tooling, and refine the foundations of our platform stack.
  • Career-defining scope: Greenfield. Executive visibility. Cross-domain exposure. This is not a maintenance role; it’s about writing the next chapter in Maersk’s data journey.

Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements.

 

We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or perform a job, please contact us by emailing [email protected]

Top Skills

AWS
dbt
Delta Lake
EMR
Glue
Python
S3
Spark
SQL
sqlmesh


