Kafka Engineering Team Lead
City: London
State/Province: London
Posting Start Date: 2/3/26
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.
Job Description:
The Kafka Engineering Team Lead will design, implement, and maintain data pipelines to ingest and process OpenShift telemetry (metrics, logs, traces) at scale. We're looking for someone who can engineer data models and routing for multi-tenant observability, ensuring lineage, quality, and SLAs across the streaming layer.
Key Responsibilities:
• Stream OpenShift telemetry via Kafka (producers, topics, schemas) and build resilient consumer services for transformation and enrichment.
• Integrate processed telemetry into Splunk for visualization, dashboards, alerting, and analytics to achieve Observability Level 4 (proactive insights).
• Implement schema management (Avro/Protobuf), governance, and versioning for telemetry events.
• Build automated validation, replay, and backfill mechanisms for data reliability and recovery.
• Instrument services with OpenTelemetry; standardize tracing, metrics, and structured logging across platforms.
• Use LLMs to enhance observability capabilities (e.g., query assistance, anomaly summarization, runbook generation).
• Collaborate with platform, SRE, and application teams to integrate telemetry, alerts, and SLOs.
• Ensure security, compliance, and best practices for data pipelines and observability platforms.
• Document data flows, schemas, dashboards, and operational runbooks.
Required Skills:
• Hands-on experience building streaming data pipelines with Kafka (producers/consumers, Schema Registry, Kafka Connect, ksqlDB, Kafka Streams).
• Proficiency with OpenShift/Kubernetes telemetry (OpenTelemetry, Prometheus) and CLI tooling.
• Experience integrating telemetry into Splunk (HEC, UF, sourcetypes, CIM), building dashboards and alerting.
• Strong data engineering skills in Python (or similar) for ETL/ELT, enrichment, and validation.
• Knowledge of event schemas (Avro/Protobuf/JSON), contracts, and backward/forward compatibility.
• Familiarity with observability standards and practices; ability to drive toward Level 4 maturity (proactive monitoring, automated insights).
• Understanding of hybrid cloud and multi-cluster telemetry patterns.
• Security and compliance for data pipelines: secret management, RBAC, encryption in transit/at rest.
Mandatory Skills: Red Hat Ansible.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention.
Perks and Benefits
Health and Wellness
Parental Benefits
Work Flexibility
Office Life and Perks
Vacation and Time Off
Financial and Retirement
Professional Development
Diversity and Inclusion