Design and develop analytical platforms, maintain data warehouse schemas, oversee data propagation, and ensure scalable architectural design for customer insights.
Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionises customer engagement by transforming contact centres into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organisations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry.
Competencies:
- Programming Skills: Strong in Python or Java for building custom data pipelines and handling advanced data transformations.
- APIs and Data Feeds: Knowledgeable in API-based integrations, especially for aggregating data from distributed sources.
- Data Modelling: Skilled in designing data warehouse schemas (e.g., star and snowflake schemas), with experience in fact and dimension tables, as well as normalisation and denormalisation techniques.
- SQL Proficiency: Advanced SQL skills for complex queries, indexing, and performance tuning.
- Data Warehousing & Storage Solutions: Proficient with platforms such as Snowflake, Amazon Redshift, Google BigQuery, and Azure Synapse Analytics.
- ETL/ELT Processes: Expertise in ETL/ELT tools (e.g., Apache NiFi, Apache Airflow, Informatica, Talend, and dbt) to facilitate data movement from source systems to the data warehouse.
- Data Integration: Experience with real-time data integration tools like Apache Kafka, Apache Spark, AWS Glue, Fivetran, and Stitch.
- Data Pipeline Management: Familiar with workflow automation tools (e.g., Apache Airflow, Luigi) to orchestrate and monitor data pipelines.
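To give a flavour of the fact/dimension modelling mentioned above, here is a minimal, purely illustrative sketch in Python of splitting denormalised rows into a customer dimension and an interactions fact table. All table and field names are hypothetical, not part of Level AI's actual schema.

```python
# Illustrative star-schema split: one dimension table (customers) keyed by a
# surrogate id, and one fact table (interactions) referencing it.

def build_star(raw_rows):
    """Split denormalised interaction rows into dimension and fact tables."""
    key_to_id = {}   # natural key (email) -> surrogate customer_id
    dim_rows = []    # customer dimension
    fact_rows = []   # interaction facts
    for row in raw_rows:
        key = row["customer_email"]
        if key not in key_to_id:
            key_to_id[key] = len(dim_rows) + 1
            dim_rows.append({"customer_id": key_to_id[key],
                             "email": key,
                             "name": row["customer_name"]})
        fact_rows.append({"customer_id": key_to_id[key],
                          "call_date": row["call_date"],
                          "duration_sec": row["duration_sec"]})
    return dim_rows, fact_rows

raw = [
    {"customer_email": "a@x.com", "customer_name": "Ann",
     "call_date": "2024-01-02", "duration_sec": 310},
    {"customer_email": "a@x.com", "customer_name": "Ann",
     "call_date": "2024-01-05", "duration_sec": 95},
]
dims, facts = build_star(raw)
```

The same separation is what star schemas express in SQL: repeated descriptive attributes live once in the dimension, while the fact table stays narrow and append-friendly.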
Responsibilities:
- Design and implement analytical platforms that provide insightful dashboards to customers.
- Develop and maintain data warehouse schemas, such as star schemas, fact tables, and dimensions, to support efficient querying and data access.
- Oversee data propagation processes from source databases to warehouse-specific databases/tools, ensuring data accuracy, reliability, and timeliness.
- Ensure the architectural design is extensible and scalable to adapt to future needs.
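The propagation responsibility above — moving data from source systems to warehouse tools while guarding accuracy — can be sketched as a simple extract/transform/load step with a row-count reconciliation check. This is a hedged, generic illustration; the function and variable names are invented for the example.

```python
# Hypothetical source-to-warehouse propagation step with a basic
# row-count reconciliation check; all names are illustrative.

def propagate(extract, transform, load):
    """Move rows from a source to a sink, verifying every extracted
    row was loaded before declaring the batch complete."""
    rows = extract()
    loaded = load([transform(r) for r in rows])
    if loaded != len(rows):
        raise RuntimeError(
            f"row-count mismatch: extracted {len(rows)}, loaded {loaded}")
    return loaded

# Toy usage: an in-memory list standing in for a real warehouse sink.
warehouse = []
count = propagate(
    extract=lambda: [{"id": 1, "text": "hello"}, {"id": 2, "text": "bye"}],
    transform=lambda r: {**r, "text": r["text"].upper()},
    load=lambda batch: warehouse.extend(batch) or len(batch),
)
```

In production the same reconciliation idea usually runs as a validation task in an orchestrator such as Apache Airflow, rather than inline as shown here.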
Qualifications:
- Qualification: B.E/B.Tech/M.E/M.Tech/PhD from a tier-1 engineering institute, with relevant work experience at a top technology company.
- 3+ years of backend and infrastructure experience, with a strong track record in development, architecture, and design.
- Hands-on experience with large-scale databases, high-scale messaging systems, and real-time job queues.
- Experience navigating and understanding large-scale systems and complex codebases and architectural patterns.
- Proven experience in building high-scale data platforms.
- Strong expertise in data warehouse schema design (star schema, fact tables, dimensions). Experience with data movement, transformation, and integration tools for data propagation across systems.
- Ability to evaluate and implement best practices in data architecture for scalable solutions.
- Nice to have: experience with Google Cloud, Django, Postgres, Celery, and Redis.
- Some experience with AI infrastructure and operations.
To learn more, visit: https://thelevel.ai/
Funding: https://www.crunchbase.com/organization/level-ai
LinkedIn: https://www.linkedin.com/company/level-ai/
Our AI platform: https://www.youtube.com/watch?v=g06q2V_kb-s
We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analysing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.
Top Skills
Amazon Redshift
Apache Airflow
Apache Kafka
Apache NiFi
Spark
AWS Glue
Azure Synapse Analytics
Celery
dbt
Fivetran
Google BigQuery
GCP
Informatica
Java
Postgres
Python
Redis
Snowflake
Stitch
Talend