Globalization Partners

Sr. Data Engineer

Posted 2 Hours Ago
Remote
Hiring Remotely in India
Senior level

About Us

Our leading SaaS-based Global Employment Platform™ enables clients to expand into over 180 countries quickly and efficiently, without the complexities of establishing local entities. At G-P, we’re dedicated to breaking down barriers to global business and creating opportunities for everyone, everywhere.

Our diverse, remote-first teams are essential to our success. We empower our Dream Team members with flexibility and resources, fostering an environment where innovation thrives and every contribution is valued and celebrated.

The work you do here will positively impact lives around the world. We stand by our promise: Opportunity Made Possible. In addition to competitive compensation and benefits, we invite you to join us in expanding your skills and helping to reshape the future of work.

At G-P, we assist organizations in building exceptional global teams in days, not months—streamlining the hiring, onboarding, and management process to unlock growth potential for all.

About this Position

As a Senior Data Engineer at Globalization Partners, you will be responsible for designing, building, and maintaining scalable data infrastructure and pipelines that power our global platform. You'll work with cutting-edge technologies to process vast amounts of data, ensuring reliability, performance, and quality across our data ecosystem.

What You Can Expect to Do:
  • Design and implement robust data pipelines for both batch and streaming data processing, handling structured and unstructured data from various sources across our global platform ecosystem.
  • Build and optimize ETL/ELT processes that efficiently transform and load data into our data warehouse and analytics platforms, ensuring data quality and consistency throughout the pipeline.
  • Develop scalable data architectures using cloud-native technologies and modern data platforms, contributing to architectural decisions that support business growth and technical excellence.
  • Collaborate with cross-functional teams including Data Scientists, Analytics Engineers, Product Managers, and Engineering teams to understand data requirements and deliver solutions that enable data-driven decision making.
  • Implement data quality frameworks and monitoring systems to ensure data accuracy, completeness, and reliability across all data assets, proactively identifying and resolving data issues.
  • Optimize data platform performance by analyzing query patterns, implementing efficient data models, and fine-tuning system configurations to handle growing data volumes and user demands.
  • Maintain and enhance data infrastructure including data warehouses, lakes, and processing clusters, ensuring high availability, security, and cost-effectiveness of our data systems.
  • Champion engineering best practices including code reviews, automated testing, CI/CD processes, and documentation to maintain high-quality, maintainable data solutions.
  • Support data governance initiatives by implementing cataloging, lineage tracking, and access controls that enable secure and compliant data usage across the organization.
  • Mentor junior engineers and contribute to the team's technical growth by sharing knowledge, conducting code reviews, and promoting a culture of continuous learning and innovation.
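To make the pipeline and data-quality responsibilities above concrete, here is a minimal sketch of a batch ETL run with a quality gate, written with only the Python standard library. It is illustrative only, not part of G-P's stack; all table, column, and function names are hypothetical, and a production pipeline would use the orchestration and warehouse tools named elsewhere in this posting.

```python
# Hypothetical batch ETL sketch: extract -> transform -> quality gate -> load.
# Uses only the standard library; names and schema are invented for illustration.
import csv
import io
import sqlite3

RAW_CSV = """employee_id,country,salary
1,DE,52000
2,IN,
3,US,91000
"""

def extract(raw: str) -> list[dict]:
    """Parse raw CSV into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Drop rows with missing salary and cast fields to proper types."""
    clean = []
    for row in rows:
        if not row["salary"]:
            continue  # a real pipeline would quarantine incomplete rows
        clean.append({
            "employee_id": int(row["employee_id"]),
            "country": row["country"].upper(),
            "salary": int(row["salary"]),
        })
    return clean

def check_quality(rows: list[dict]) -> None:
    """Fail the run early if basic expectations are violated."""
    assert rows, "no rows survived transformation"
    assert all(r["salary"] > 0 for r in rows), "non-positive salary found"

def load(rows: list[dict], conn: sqlite3.Connection) -> int:
    """Insert clean rows into a warehouse-style table; return row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS employees "
        "(employee_id INTEGER, country TEXT, salary INTEGER)"
    )
    conn.executemany(
        "INSERT INTO employees VALUES (:employee_id, :country, :salary)",
        rows,
    )
    return conn.execute("SELECT COUNT(*) FROM employees").fetchone()[0]

conn = sqlite3.connect(":memory:")
clean_rows = transform(extract(RAW_CSV))
check_quality(clean_rows)
loaded = load(clean_rows, conn)
print(loaded)  # 2 (row 2 is dropped for its missing salary)
```

In practice each stage would be a task in an orchestrator such as Airflow, with the quality gate blocking the load step, but the shape of the work is the same.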
What We Are Looking For:
  • 6+ years of experience in data engineering with a strong track record of building and maintaining production data systems at scale.
  • Expert-level proficiency in Python with experience in data processing libraries (PySpark, pandas, NumPy).
  • Advanced SQL skills with deep understanding of query optimization, indexing strategies, and performance tuning across various database systems.
  • Hands-on experience with modern data platforms such as Databricks, Snowflake, or cloud-native solutions (AWS Redshift, Google BigQuery, Azure Synapse) including best practices for enterprise deployments.
  • Strong background in ETL/ELT processes with experience using tools like Apache Airflow, dbt, Fivetran, or similar orchestration and transformation frameworks.
  • Proficiency with cloud platforms (preferably AWS) including services like S3, Lambda, Glue, EMR, Kinesis, and understanding of cloud-native data architectures.
  • Experience with streaming data technologies such as Apache Kafka, AWS Kinesis, or similar real-time data processing systems.
  • Knowledge of data modeling techniques including dimensional modeling, data vault methodology, and modern approaches like wide tables and denormalization strategies.
  • Familiarity with data governance tools such as Atlan, Alation, Informatica, or Collibra, and experience implementing data cataloging and lineage tracking.
  • Understanding of data quality tools and practices with experience using platforms like Monte Carlo, Great Expectations, or custom quality frameworks.
  • Experience with Infrastructure as Code (Terraform, CloudFormation).
  • Strong analytical and problem-solving skills with ability to troubleshoot complex data issues, optimize performance, and design solutions for ambiguous requirements.
  • Excellent communication skills with experience collaborating with technical and non-technical stakeholders to translate business requirements into technical solutions.
  • Bachelor's degree in Computer Science, Engineering, or related field; advanced degrees or equivalent professional experience are valued.
  • Fluency in English, both verbal and written, with the ability to work effectively in a global, distributed team environment.

We will consider for employment all qualified applicants who meet the inherent requirements for the position. Please note that background checks are required and may include criminal record checks.


Individuals with disabilities are encouraged to apply for these positions.


G-P. Global Made Possible.

G-P is a proud Equal Opportunity Employer, and we are committed to building and maintaining a diverse, equitable and inclusive culture that celebrates authenticity. We prohibit discrimination and harassment against employees or applicants on the basis of race, color, creed, religion, national origin, ancestry, citizenship status, age, sex or gender (including pregnancy, childbirth, and pregnancy-related conditions), gender identity or expression (including transgender status), sexual orientation, marital status, military service and veteran status, physical or mental disability, genetic information, or any other legally protected status.

G-P is also committed to providing reasonable accommodations to individuals with disabilities. If you need an accommodation due to a disability during the interview process, please contact us at [email protected].

Top Skills

Apache Airflow
Apache Kafka
AWS
AWS Redshift
Azure Synapse
CloudFormation
Databricks
dbt
EMR
Fivetran
Glue
Google BigQuery
Kinesis
Lambda
NumPy
pandas
PySpark
Python
S3
Snowflake
SQL
Terraform


