Snowflake Lead Developer
City: Bengaluru
State/Province: Karnataka
Posting Start Date: 2/23/26
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.
Job Description
Role Purpose
The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialist team.
Primary Skill: Snowflake
Secondary Skills: Azure, Cosmos DB
Role Summary
We are looking for a Level-2 Data Engineer with strong hands-on experience in Snowflake along with working knowledge of Azure services and Cosmos DB. The candidate will be responsible for building, optimizing, and maintaining data pipelines, Snowflake objects, and integrations within Azure cloud ecosystems.
Responsibilities
- Develop and optimize ELT/ETL pipelines with Snowflake as the central data platform.
- Design and implement Snowflake objects: tables, views, stages, file formats, streams, tasks, and resource monitors.
- Perform query optimization, warehouse tuning, clustering, micro-partitioning, and cost governance.
- Integrate Snowflake with Azure Data Factory, ADLS, and Cosmos DB.
- Build secure data flows using RBAC, masking policies, secure views, and other Snowflake governance features.
- Work with Cosmos DB for ingestion, change feed, RU management, and query performance.
- Participate in requirement analysis, estimations, and Agile ceremonies.
- Prepare technical documentation, version control, and deployment support using CI/CD tools.
Required Skills
Snowflake (Primary)
- 3 years of hands-on Snowflake development experience.
- Strong SQL and Snowflake performance tuning.
- Experience with Streams and Tasks (CDC), warehouse sizing, and cost optimization.
- Knowledge of data modeling: Star/Snowflake schemas, SCDs, and incremental loads.
- Security experience with roles, masking policies, and secure views.
Azure & Cosmos DB (Secondary)
- Experience with Azure Data Factory (pipelines, triggers, linked services).
- Knowledge of Azure Storage/ADLS, Key Vault, and basic networking (managed identities, private endpoints).
- Cosmos DB: partition key design, RU provisioning, ingestion patterns, change feed basics.
Nice to Have
- Python or PySpark for transformations.
- Experience with Databricks or dbt.
- Familiarity with CI/CD (Azure DevOps/GitHub).
- Knowledge of Power BI/Tableau.
Deliver
Performance Parameters and Measures:
1. Process: No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT.
2. Team Management: Productivity, efficiency, absenteeism.
3. Capability Development: Triages completed, Technical Test performance.
Mandatory Skills: Snowflake.
Experience: 5-8 Years.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention.
Perks and Benefits
Health and Wellness
Parental Benefits
Work Flexibility
Office Life and Perks
Vacation and Time Off
Financial and Retirement
Professional Development
Diversity and Inclusion