The Data Engineer will design and maintain scalable semantic models using AtScale and cloud data warehouses, supporting enterprise analytics and self-service BI.
Allata is a fast-growing technology strategy and data development consulting firm delivering scalable solutions to our enterprise clients. Our mission is to inspire our clients to achieve their most strategic goals through uncompromised delivery, active listening, and personal accountability. We are a team that thrives in fast-paced environments, works on complex problems, learns continually, and collaborates to be better together.
We at IMRIEL (an Allata Company) are looking for experienced and technically strong Analytics Data Engineers to design, build, and maintain scalable semantic models using AtScale and cloud-based data warehouse platforms. The role involves developing logical cubes, defining MDX-based business measures, and enabling governed, self-service BI consumption for enterprise analytics.
Experience: 3 to 5 years
Location: Vadodara & Pune
What you'll be doing:
· Design and implement robust semantic data models using AtScale that abstract curated datasets into business-consumable layers.
· Conduct a comprehensive proof of concept (POC) to evaluate three candidate semantic layer platforms: AtScale, Microsoft Fabric, and Cube.dev, assessing their performance, scalability, and integration with cloud-based platforms such as Databricks and Snowflake.
· Develop and maintain logical cubes with calculated measures, dimension hierarchies, and drill-down paths to support self-service analytics.
· Leverage MDX (Multidimensional Expressions) to define advanced business logic, KPIs, and aggregations aligned with enterprise reporting needs.
· Configure and manage aggregate tables using AtScale Aggregate Designer, optimizing cube performance and reducing query latency.
· Integrate semantic models with BI tools such as Power BI, Tableau, and Excel Pivot Tables, ensuring seamless end-user experiences.
· Collaborate with data engineers to align semantic models with curated data sources, transformation views, and data pipelines.
· Apply star and snowflake schema design to model fact and dimension tables, ensuring optimal structure for analytical workloads (a minimal query sketch follows this list).
· Implement Slowly Changing Dimensions (SCD Types 1 & 2) and maintain historical accuracy in reporting models.
· Manage row-level security (RLS) and role-based access control (RBAC) policies within semantic layers for governed data access.
· Participate in semantic model versioning, CI/CD-based deployments, and technical documentation.
· Troubleshoot semantic layer performance issues using AtScale query logs, plan analysis, and caching strategies.
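For context on the kind of curated star-schema structures these semantic models abstract (see the star/snowflake bullet above), a minimal query sketch follows. All table and column names (fact_sales, dim_date, dim_product, and their columns) are hypothetical rather than drawn from a real client model; in practice the semantic layer would expose this rollup as a governed measure with drill-down paths instead of a hand-written join.

-- Illustrative only: a star-schema rollup over hypothetical fact and dimension tables.
SELECT
    d.calendar_year,
    d.calendar_month,
    p.product_category,
    SUM(f.net_sales_amount) AS net_sales,
    SUM(f.order_quantity)   AS units_sold
FROM fact_sales f
JOIN dim_date    d ON f.date_key    = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY
    d.calendar_year,
    d.calendar_month,
    p.product_category;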
What you need:
Basic Skills:
· Minimum 3 years of hands-on experience with AtScale, including building and maintaining semantic models, designing logical cubes, and implementing calculated measures using MDX. Proficiency in the AtScale interface, modeling best practices, and performance tuning is essential.
· Advanced experience in developing and optimizing DAX expressions for complex calculations in Power BI models, with a proven ability to translate these into new semantic layer technologies like AtScale or Cube.dev.
· Strong experience with MDX, including creating calculated members, KPIs, and advanced expressions. Excellent SQL skills with the ability to write complex queries using joins, CTEs, window functions, and performance tuning.
· Solid understanding of dimensional modeling. Ability to design fact/dimension tables using star/snowflake schemas, support SCD logic, and maintain model consistency (an illustrative SCD Type 2 sketch follows this skills list).
· Familiarity with the Kimball methodology for dimensional modeling, including conformed dimensions, fact table granularity, and slowly changing dimensions, to design scalable, analytics-friendly data structures.
· Hands-on experience with Snowflake, Redshift, or BigQuery. Familiarity with virtual warehouses, caching, clustering, partitioning, and compute-storage separation.
· Experience implementing RLS and RBAC. Ability to define and enforce granular access controls within semantic models.
· Strong grasp of OLAP concepts like query abstraction, drill-down/roll-up, and cube optimization. Understanding of business logic abstraction from physical data.
· Skilled in using AtScale performance tools such as the Aggregate Designer, log analysis, and query optimization.
· Proficient in managing model development lifecycle using Git, automation tools, and collaboration workflows with data/analytics teams.
· Strong verbal and written communication to document models, explain logic, and coordinate with cross-functional teams.
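As context for the SCD expectations above, here is a minimal two-step SCD Type 2 sketch in SQL. The tables (stg_customer, dim_customer) and tracked attributes are hypothetical, the change comparison is simplified (it ignores NULL handling), and UPDATE ... FROM syntax varies slightly by warehouse; treat this as a sketch of the pattern, not a production implementation.

-- Step 1: close out the current version of any customer whose tracked attributes changed.
UPDATE dim_customer
SET    effective_end_date = CURRENT_DATE,
       is_current         = FALSE
FROM   stg_customer AS s
WHERE  dim_customer.customer_id = s.customer_id
  AND  dim_customer.is_current  = TRUE
  AND  (dim_customer.customer_segment <> s.customer_segment
        OR dim_customer.region <> s.region);

-- Step 2: insert a new current row for changed or brand-new customers.
INSERT INTO dim_customer
    (customer_id, customer_segment, region,
     effective_start_date, effective_end_date, is_current)
SELECT s.customer_id, s.customer_segment, s.region,
       CURRENT_DATE, DATE '9999-12-31', TRUE
FROM   stg_customer AS s
LEFT JOIN dim_customer AS d
       ON d.customer_id = s.customer_id
      AND d.is_current  = TRUE
WHERE  d.customer_id IS NULL;  -- no open current row remains: the customer is new or was just expired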
Responsibilities:
· Own the design, development, deployment, and maintenance of scalable, governed semantic models.
· Implement complex MDX logic and optimized aggregate strategies to meet performance benchmarks.
· Design and implement scalable AtScale architectures, including the development of architectural blueprints and data flow diagrams.
· Evaluate and implement the best semantic layer architecture for Power BI by leveraging tools like Microsoft Fabric or other modern BI accelerators to support self-service analytics.
· Define business measures, hierarchies, and drill-down paths in semantic models aligned with enterprise KPIs.
· Align semantic layers with upstream data transformations, curated datasets, and data warehouse architecture.
· Enforce governance and security through robust RLS and RBAC implementations (an illustrative warehouse-side RLS sketch follows this list).
· Continuously monitor, test, and tune semantic model performance using diagnostic tools and AtScale logging.
· Ensure semantic layer reusability, consistency, and business-aligned metric standardization.
· Collaborate with BI developers and analysts to understand reporting needs and validate model outputs.
· Maintain documentation, data lineage, and business glossaries that support transparency and user adoption.
· Contribute to reusable templates, modeling standards, and automation frameworks.
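Where row-level security is also enforced at the warehouse layer to complement the semantic-layer security model, a secure-view pattern like the Snowflake-flavored sketch below is one common approach. All object names (fact_sales, security_user_region, vw_sales_secured) are hypothetical, and other warehouses expose the current user through a similar but differently named function.

-- Illustrative only: filter a fact table by the querying user's region entitlements.
CREATE OR REPLACE SECURE VIEW vw_sales_secured AS
SELECT f.*
FROM   fact_sales f
JOIN   security_user_region m
  ON   m.region_key = f.region_key
WHERE  UPPER(m.user_name) = UPPER(CURRENT_USER());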
Nice to have:
· Experience with AtScale REST APIs for metadata-driven automation and CI/CD pipelines.
· Familiarity with BI visualization platforms such as Power BI, Tableau, Looker, and Excel OLAP integration.
· Scripting experience in Python, Shell, or YAML for configuration management or automation tasks.
· Cloud certifications in Snowflake, Databricks, AWS, Azure, or Google Cloud Platform.
· Exposure to metadata management, data cataloging, or enterprise data governance tools.
At Allata, we value differences.
Allata is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Allata makes employment decisions without regard to race, color, creed, religion, age, ancestry, national origin, veteran status, sex, sexual orientation, gender, gender identity, gender expression, marital status, disability or any other legally protected category.
This policy applies to all terms and conditions of employment, including but not limited to, recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.
Top Skills
AtScale
BigQuery
Cube.dev
DAX
Excel
MDX
Microsoft Fabric
Power BI
Redshift
Snowflake
SQL
Tableau