
Parexel

Senior Data Engineer

In-Office or Remote
Hiring Remotely in Hyderabad, Telangana
Senior level
Design, build, and maintain data pipelines using Azure tools and collaborate with teams to translate business needs into technical solutions. Mentor junior engineers and ensure data quality and governance while implementing CI/CD practices.

When our values align, there's no limit to what we can achieve.
 
At Parexel, we all share the same goal - to improve the world's health. From clinical trials to regulatory, consulting, and market access, every clinical development solution we provide is underpinned by something special - a deep conviction in what we do.

Each of us, no matter what we do at Parexel, contributes to the development of a therapy that ultimately will benefit a patient. We take our work personally, we do it with empathy and we're committed to making a difference.

Key Responsibilities

  • Architect and implement end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake for large-scale data ingestion, transformation, and storage.

  • Design, build, modify, and support data pipelines in a medallion architecture using Microsoft Azure data PaaS services, Databricks, and Power BI (a minimal pipeline sketch follows this list).

  • If necessary, create prototypes to validate proposed ideas and solicit input from stakeholders.

  • Demonstrate a strong command of test-driven development and continuous integration processes.

  • Analysis and Design – convert high-level designs into low-level designs and implement them.

  • Collaborate with Team Leads to define/clarify business requirements, estimate development costs, and finalize work plans.

  • Unit and Integration Testing – create and run unit and integration tests on all delivered code throughout the development lifecycle (a test sketch also follows this list).

  • Benchmark application code proactively to prevent performance and scalability issues.

  • Collaborate with the Quality Assurance Team on issue reporting, resolution, and change management.

  • Support and Troubleshooting – Assist the Operations Team with any environment-related issues that arise during application deployment in the Development, QA, Staging, and Production environments.

  • Familiarity with Power BI and Reltio is advantageous but not required.

  • Collaborate with BI teams to ensure data models are optimized for reporting in Power BI, with a focus on performance and usability.

  • Establish data governance, quality, and security controls, ensuring compliance with GDPR, HIPAA, and global clinical data regulations.

  • Mentor and guide junior engineers, fostering technical excellence and knowledge sharing.

  • Drive automation and CI/CD practices within data engineering pipelines, integrating with version control and deployment workflows.

  • Work closely with Data Architects, Business Analysts, and Product Owners to translate business needs into technical solutions.
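For illustration of the medallion-style pipeline work referenced above, here is a minimal PySpark sketch of a bronze-to-silver promotion on Databricks; the table names, columns, and cleansing rules are hypothetical placeholders rather than part of any actual Parexel pipeline.

```python
# Illustrative only: promote raw records from a bronze (raw) Delta table to a
# cleansed silver table on Databricks. Table names, columns, and rules are
# hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

bronze = spark.read.table("bronze.site_visits")           # hypothetical raw table

silver = (
    bronze
    .withColumn("visit_date", F.to_date("visit_date"))    # normalise types
    .filter(F.col("site_id").isNotNull())                  # basic quality gate
    .dropDuplicates(["site_id", "visit_date"])             # keep re-runs idempotent
)

silver.write.format("delta").mode("overwrite").saveAsTable("silver.site_visits")
```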
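Similarly, a minimal pytest sketch shows how such a cleansing step could be unit tested against a local SparkSession; clean_site_visits and the sample data are assumed names invented for illustration.

```python
# Illustrative unit test for a hypothetical cleansing function, run with pytest
# against a local SparkSession (no Databricks cluster required).
import pytest
from pyspark.sql import SparkSession, functions as F


def clean_site_visits(df):
    """Hypothetical cleansing step: drop null sites and exact duplicates."""
    return (
        df.filter(F.col("site_id").isNotNull())
          .dropDuplicates(["site_id", "visit_date"])
    )


@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()


def test_nulls_and_duplicates_removed(spark):
    rows = [
        ("S01", "2024-01-01"),
        ("S01", "2024-01-01"),   # duplicate row
        (None,  "2024-01-02"),   # missing site_id
    ]
    df = spark.createDataFrame(rows, ["site_id", "visit_date"])
    assert clean_site_visits(df).count() == 1
```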
     

Skills:

  • Expert-level knowledge of Azure Data Factory, Databricks, and Snowflake.

  • Understanding of quality processes and estimation techniques.

  • Understanding of design concepts and architectural basics.

  • Fundamental grasp of the project domain.

  • Ability to translate functional and non-functional needs into system requirements.

  • Ability to develop and code complex applications.

  • Ability to create test cases and scenarios based on specifications.

  • Solid knowledge of SDLC and agile techniques.

  • Knowledge of current technologies and trends.

  • Logical thinking and problem-solving abilities, along with the ability to collaborate effectively.

  • Primary skills: cloud platforms, Azure, Databricks, Azure Data Factory (ADF), Azure DevOps (ADO).

  • Sought: SQL, Python, Power BI.

  • General Knowledge: PowerApps, Java/Spark, Reltio.

  • 5-7 years of experience in software development, including a minimum of 3 years in cloud computing.

  • Proficient in SQL, Python, and cloud-native architecture.

  • Strong grasp of data security, privacy compliance, and best practices in a regulated environment.
     

Education:

  • Bachelor's degree in a technical discipline (Mathematics, Science, Engineering, Computing, etc.)

Top Skills

Azure Data Factory
Databricks
Java
PowerApps
Power BI
Python
Reltio
Snowflake
Spark
SQL
