Databricks Architect
City: Bengaluru
State/Province: Karnataka
Posting Start Date: 2/12/26
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.
Job Description:
Role Purpose
The purpose of the role is to create exceptional architectural solution designs, provide thought leadership, and enable delivery teams to deliver outstanding client engagement and satisfaction.
Databricks Architect
Key Responsibilities:
• Design, develop, and optimize ETL/ELT pipelines in Databricks.
• Implement real-time and batch data processing solutions using Apache Spark and Delta Lake.
• Develop PySpark/Scala-based data transformation scripts for large-scale data processing.
• Ensure data quality, performance tuning, and cost optimization within Databricks.
• Implement CI/CD pipelines for Databricks workflows using Terraform, GitHub Actions, or Azure DevOps.
• Monitor, troubleshoot, and optimize Databricks clusters, jobs, and queries.
• Collaborate with Data Architects, Business Analysts, and DevOps teams to align solutions with business needs.

Required Skills:
• Strong expertise in Databricks (AWS) and Apache Spark.
• Proficiency in PySpark for data engineering workflows.
• Experience with Delta Lake, Unity Catalog, and Databricks SQL.
• Hands-on experience with Kafka, APIs, and streaming data processing.
• Proficiency in SQL for querying and performance tuning.
• Experience in DevOps and CI/CD pipelines for Databricks.
• Good understanding of Data Governance, Security, and Access Control in Databricks.

Good to Have:
• Experience with AWS Glue, Azure Data Factory, or Snowflake.
• Familiarity with Terraform, Databricks CLI, and automation frameworks.
Mandatory Skills: Databricks - Data Engineering.
Experience: 8-10 years.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention.
Perks and Benefits
Health and Wellness
Parental Benefits
Work Flexibility
Office Life and Perks
Vacation and Time Off
Financial and Retirement
Professional Development
Diversity and Inclusion