Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.
Job Description
Role Purpose
The purpose of this role is to provide significant technical expertise in the architecture planning and design of the relevant tower (platform, database, middleware, backup, etc.) as well as to manage its day-to-day operations.
Data Engineer:
- Design, develop, optimize and maintain scalable and reliable big data solutions, including data pipelines, data warehouses, and data lakes.
- Collaborate with cross-functional teams including data product managers, data scientists, analysts, and software engineers to understand business data requirements and deliver efficient solutions.
- Architect and optimize data storage, processing, and retrieval mechanisms for large-scale datasets.
- Establish scalable, efficient, automated processes for data analyses, model development, validation, and implementation.
- Implement and maintain data governance and security best practices to ensure data integrity and compliance with regulatory standards.
- Write efficient and well-organized software to ship products in an iterative, continual-release environment.
- Report key insight trends, applying statistical rigor to simplify findings and inform the wider team of noteworthy storylines that impact the business.
- Troubleshoot and resolve performance issues, bottlenecks, and data quality issues in the big data infrastructure.
- Guide and mentor junior engineers, fostering a culture of continuous learning and technical excellence.
- Communicate clearly and effectively to technical and non-technical audiences.
- Contribute to internal best practices, frameworks, and reusable components to enhance the efficiency of the data engineering team.
- Embody the values and passions that characterize Levi Strauss & Co., and engage with empathy with colleagues from diverse backgrounds.
Skills Required:
- University or advanced degree in engineering, computer science, mathematics, or a related field
- Experience developing and deploying both batch and streaming data pipelines into production.
- Strong experience working with a variety of relational SQL and NoSQL databases.
- Extensive experience with the cloud-native data services of Google Cloud Platform (BigQuery, Vertex AI, Pub/Sub, Cloud Functions, etc.).
- Deep expertise in a popular data warehousing platform such as Snowflake, BigQuery, or Redshift.
- Hands-on experience with dbt (data build tool) for data transformation.
- Experience working with big data tools and frameworks such as Hadoop, Spark, Kafka, etc. Familiarity with Databricks is a plus.
- Experience with object-oriented and functional programming languages such as Python, Java, C++, or Scala.
- Hands-on experience across the data engineering spectrum, e.g. developing metadata-driven frameworks for ingestion and processing, and building data lake/lakehouse solutions.
- Strong knowledge of Apache Airflow for orchestration and workflow management (a short illustrative sketch follows this list).
- Working knowledge of Git and GitHub.
- Experience providing operational support to stakeholders.
- Expertise in standard software engineering methodology, e.g. unit testing, test automation, continuous integration, code reviews, design documentation.
- Experience working with CI/CD pipelines using Jenkins and GitHub Actions.
- Experience with data visualization tools such as Tableau, Power BI, or Looker is a plus.
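As a purely illustrative sketch of the orchestration and transformation work described above (not an additional requirement), the following minimal Apache Airflow DAG triggers a dbt run after a batch ingestion step. The DAG id, schedule, ingestion callable, and dbt project paths are hypothetical placeholders, and the example assumes Airflow 2.4+ with the dbt CLI available on the worker.

# Minimal, illustrative Airflow DAG: ingest raw data, then run dbt models.
# All names (dag_id, paths, load_raw_data) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def load_raw_data():
    # Placeholder for a batch ingestion step, e.g. landing files in a data lake.
    print("Loading raw data...")


with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    ingest = PythonOperator(
        task_id="ingest_raw_data",
        python_callable=load_raw_data,
    )

    transform = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/project --profiles-dir /opt/dbt",
    )

    ingest >> transform  # run the transformation only after ingestion succeeds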
Deliver
No. | Performance Parameter | Measure
1 | Operations of the tower | SLA adherence; Knowledge management; CSAT/Customer Experience; Identification of risk issues and mitigation plans
2 | New projects | Timely delivery; Avoid unauthorised changes; No formal escalations
If you encounter any suspicious mail, advertisements, or persons who offer jobs at Wipro, please email us at helpdesk.recruitment@wipro.com. Do not email your resume to this ID as it is not monitored for resumes and career applications.
Any complaints or concerns regarding unethical/unfair hiring practices should be directed to our Ombuds Group at ombuds.person@wipro.com.
We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, caste, creed, religion, gender, marital status, age, ethnic and national origin, gender identity, gender expression, sexual orientation, political orientation, disability status, protected veteran status, or any other characteristic protected by law.
Wipro is committed to creating an accessible, supportive, and inclusive workplace. Reasonable accommodation will be provided to all applicants including persons with disabilities, throughout the recruitment and selection process. Accommodations must be communicated in advance of the application, where possible, and will be reviewed on an individual basis. Wipro provides equal opportunities to all and values diversity.