
Fractal

Data Engineer

Reposted 2 Days Ago
In-Office
5 Locations
Senior level

It's fun to work in a company where people truly BELIEVE in what they are doing!

We're committed to bringing passion and customer focus to the business.

If you are an extraordinary developer who loves to push the boundaries to solve complex business problems with creative solutions, we want to talk with you. As a Data Engineer - Azure, you will work in the Technology team that delivers our Data Engineering offerings at large scale to our Fortune clients worldwide. The role is responsible for innovating, building, and maintaining technology services.
Experience required: 5-10 years
RESPONSIBILITIES:
  • Be an integral part of large-scale client business development and delivery engagements
  • Develop the software and systems needed for end-to-end execution on large projects
  • Work across all phases of SDLC, and use Software Engineering principles to build scaled solutions
  • Build the knowledge base required to deliver increasingly complex technology projects
  • Bring team-handling, problem-solving, project-management, and communication skills, along with creative thinking
QUALIFICATIONS:
  • A bachelor’s degree in Computer Science or related field with 6-12 years of technology experience
  • Strong experience in System Integration, Application Development or Data-Warehouse projects, across technologies used in the enterprise space
  • Software development experience with object-oriented languages (e.g. Python) and frameworks (e.g. PySpark)
  • Database programming in any flavor of SQL
  • Expertise in relational and dimensional modelling, including big data technologies
  • Exposure across all the SDLC process, including testing and deployment
  • Expertise in Microsoft Azure is mandatory, including components such as Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HDInsight, and Azure ML Service
  • Good knowledge of Python and Spark is required
  • Good understanding of how to enable analytics using cloud technology and MLOps
  • Experience with Azure Infrastructure and Azure DevOps is a strong plus
  • Proven track record of keeping existing technical skills current and developing new ones, enabling strong contributions to deep architecture discussions around systems and applications in the cloud (Azure)
  • Characteristics of a forward thinker and self-starter
  • Ability to work with a global team of consulting professionals across multiple projects
  • Knack for helping an organization to understand application architectures and integration approaches, to architect advanced cloud-based solutions, and to help launch the build-out of those systems
  • Passion for educating, training, designing, and building end-to-end systems that lead a diverse and challenging set of customers to success
  • Exposure to GenAI is an added advantage
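To give a flavor of the Python-based data transformation work the qualifications above describe: a rough, illustrative sketch only (not part of the posting; the file layout, schema, and data-quality rule are all hypothetical), showing a cleaning-and-aggregation step of the kind such pipelines run before loading a warehouse table:

```python
import csv
import io
from collections import defaultdict

# Hypothetical raw extract: order events as CSV, as they might land in a
# data-lake "raw" zone before a scheduled transformation job cleans them.
RAW_CSV = """order_id,region,amount
1001,EMEA,250.00
1002,APAC,
1003,EMEA,125.50
1004,APAC,300.00
"""

def transform(raw_text):
    """Minimal ETL step: drop rows with missing amounts, then
    aggregate revenue per region (a typical dimensional rollup)."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(raw_text)):
        if not row["amount"]:
            continue  # data-quality rule: skip incomplete records
        totals[row["region"]] += float(row["amount"])
    return dict(totals)

print(transform(RAW_CSV))  # {'EMEA': 375.5, 'APAC': 300.0}
```

In a production Azure setting, the same shape of logic would typically run as a PySpark job on Databricks, with Azure Data Factory handling orchestration and Azure Data Lake Storage holding the raw and curated zones.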

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!


Top Skills

Azure
Azure Data Factory
Azure Data Lake Storage
Azure Databricks
Azure SQL
HDInsight
ML Service
PySpark
Python
SQL

