The Senior Data Engineer is responsible for designing and developing data solutions using GCP technologies, emphasizing ETL processes and data architecture. The role requires expertise in SQL, Python, and various data management tools to handle large-scale data operations and implement data warehouses and analytics platforms.
Data Specialist
Primary Skills
- ETL Fundamentals, BigQuery, Dataproc, SQL (Basic + Advanced), Python, Data Catalog, Data Warehousing, Composer, Dataflow, Cloud Trace, Cloud Logging, Cloud Storage, Datafusion, Modern Data Platform Fundamentals, PL/SQL, Data Modelling Fundamentals
Job requirements
- 5 years of experience in software design and development; 3 years of experience in the data engineering field is preferred
- 3 years of hands-on experience with the GCP cloud data implementation suite, including BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, and Cloud Storage
- Strong experience with and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms
- Strong hands-on experience in the technologies below (see the sketch after this list)
- 1. Google BigQuery (GBQ) querying
- 2. Python
- 3. Apache Airflow
- 4. SQL (BigQuery dialect preferred)
- Extensive hands-on experience working with data using SQL and Python, including Cloud Functions
- Comparable skills in AWS and other cloud big data engineering platforms are also considered
- Experience with agile development methodologies
- Excellent verbal and written communication skills, with the ability to clearly present ideas, concepts, and solutions
- Bachelor's Degree in Computer Science, Information Technology, or closely related discipline
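For illustration only, the sketch below shows the kind of pipeline this stack implies: a Composer (Airflow 2.x) DAG that loads files from Cloud Storage into BigQuery and then runs a transformation query. All bucket, dataset, and table names are hypothetical placeholders, and the operators come from the apache-airflow-providers-google package; this is a minimal example of the toolchain named above, not the employer's actual pipeline.

```python
# Minimal sketch, assuming Airflow 2.x on Cloud Composer with the
# apache-airflow-providers-google package installed.
# All resource names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="example_gcs_to_bq",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Stage raw CSV files from Cloud Storage into a BigQuery landing table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="example-landing-bucket",                  # hypothetical bucket
        source_objects=["orders/*.csv"],                  # hypothetical path
        destination_project_dataset_table="analytics.raw_orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform the landing table into a curated table with standard SQL.
    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.curated_orders AS
                    SELECT order_id, customer_id, SUM(amount) AS total_amount
                    FROM analytics.raw_orders
                    GROUP BY order_id, customer_id
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```

On Cloud Composer, a DAG file like this is typically uploaded to the environment's dags/ folder in Cloud Storage, from which the scheduler picks it up and runs it on the defined schedule.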
Top Skills
BigQuery
Cloud Storage
Composer
Data Catalog
Data Warehousing
Dataflow
Datafusion
Dataproc
ETL Fundamentals
PL/SQL
Python
SQL
Brillio Pune Office: Pune, Mahārāshtra, India