Data Engineer
City: London
State/Province: London
Posting Start Date: 2/24/26
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.
Job Description
We are seeking a Data Engineer with strong hands-on expertise in Python, PySpark, Apache Airflow, and AWS to design, build, and optimize scalable, cloud-native data pipelines. The role involves working with large-scale batch and streaming data, implementing robust ETL frameworks, and ensuring data quality, reliability, and performance across analytics and downstream consumption layers.
Key Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines using Python and PySpark on AWS
- Orchestrate batch and incremental workflows using Apache Airflow (DAG design, scheduling, retries, dependencies)
- Build and optimize data pipelines leveraging AWS services such as S3, EC2, Glue, Lambda, RDS, EMR
- Implement data ingestion from multiple structured and semi-structured sources (RDBMS, APIs, files, streams)
- Optimize PySpark jobs using partitioning, caching, joins, broadcast variables, and performance tuning techniques
- Ensure data quality through validation rules, schema enforcement, error handling, and reconciliation checks
- Implement CI/CD pipelines for data workflows using Git, Jenkins/AWS CodePipeline, and automated testing
- Monitor data pipelines, troubleshoot failures, and resolve performance bottlenecks in production environments
- Collaborate with Data Analysts, BI teams, Data Scientists, and Architects to deliver analytics-ready datasets
- Follow Agile/Scrum practices, participate in code reviews, and contribute to design and architecture discussions
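The data-quality responsibility above (validation rules, schema enforcement, reconciliation checks) can be sketched in miniature. This is a hedged, illustrative example only: it uses plain Python dicts in place of the PySpark DataFrames the role would actually use, and the field names (`order_id`, `amount`) and rules are hypothetical, not taken from the posting.

```python
# Illustrative data-quality sketch: schema enforcement, a validation rule,
# and a simple source-vs-target reconciliation check.
# Plain-Python stand-in for what would normally run over PySpark DataFrames.

EXPECTED_SCHEMA = {"order_id": int, "amount": float}  # hypothetical schema

def validate_record(record):
    """Return a list of rule violations for one record (empty = valid)."""
    errors = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"bad type for {field}: {type(record[field]).__name__}")
    if not errors and record["amount"] < 0:  # example business rule
        errors.append("amount must be non-negative")
    return errors

def reconcile(source_rows, target_rows):
    """Row-count and control-total reconciliation between two layers."""
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "amount_sum_match": (
            sum(r["amount"] for r in source_rows)
            == sum(r["amount"] for r in target_rows)
        ),
    }

good = {"order_id": 1, "amount": 9.99}
bad = {"order_id": "x", "amount": -1.0}
print(validate_record(good))          # no violations
print(validate_record(bad))           # type violation reported
print(reconcile([good], [good]))      # both checks pass
```

In a real pipeline these checks would typically run as a post-load step, with failing records routed to an error table rather than silently dropped.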
Mandatory Skills
- Strong programming experience in Python
- Hands-on expertise with PySpark/Spark SQL
- Proven experience in Apache Airflow for workflow orchestration
- Solid experience with AWS Cloud (S3, EC2, Glue, Lambda, EMR, RDS)
- Strong understanding of ETL/ELT concepts and data pipeline design
- Experience working with large-scale datasets (batch and/or streaming)
- Proficiency with Git version control and CI/CD pipelines
- Good understanding of data warehousing concepts (fact/dimension, star schema, SCD)
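One of the warehousing concepts listed above, Slowly Changing Dimension (SCD) Type 2, can be shown in a minimal sketch: expire the current version of a changed row and append a new current version, rather than overwriting history. This is a plain-Python stand-in for what would normally be a PySpark or SQL MERGE, with hypothetical column names (`customer_id`, `city`).

```python
# SCD Type 2 sketch: on a changed attribute, expire the current row and
# append a new current version; brand-new keys are simply inserted.
# Mutates the existing "current" dicts in place for brevity.

def apply_scd2(dimension, updates, business_key="customer_id"):
    result = list(dimension)
    for update in updates:
        current = next(
            (r for r in result
             if r[business_key] == update[business_key] and r["is_current"]),
            None,
        )
        if current is None:
            result.append({**update, "is_current": True})   # brand-new key
        elif current["city"] != update["city"]:             # tracked attribute changed
            current["is_current"] = False                   # expire old version
            result.append({**update, "is_current": True})   # new current version
        # unchanged rows are left untouched
    return result

dim = [{"customer_id": 1, "city": "Leeds", "is_current": True}]
for row in apply_scd2(dim, [{"customer_id": 1, "city": "London"}]):
    print(row)  # old Leeds row expired, new London row current
```

A production version would also carry effective-from/effective-to dates on each version; they are omitted here to keep the sketch short.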
Mandatory Skills: IPython.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention.
Perks and Benefits
Health and Wellness
Parental Benefits
Work Flexibility
Office Life and Perks
Vacation and Time Off
Financial and Retirement
Professional Development
Diversity and Inclusion