Cloud Data Engineer
Location: Mexico (Mexico City, Guadalajara, Monterrey) - Hybrid as per Infosys Mexico policy
Time Zone: CST/EST
Role Overview
As a Cloud Data Engineer, you will work directly with clients to build and support data engineering solutions. Your work will include creating data pipelines, supporting data lake projects, building reporting dashboards, and defining and implementing scalable workflows. The role requires strong hands-on skills in Python and SQL and prior working experience with a cloud platform (AWS preferred), with a focus on delivering reliable, scalable cloud-based data systems. You will collaborate with multiple teams and help ensure that data processes run smoothly and efficiently.
Key Responsibilities
- Build and maintain data pipelines for client data engineering and data science needs.
- Work on data lake development and related cloud initiatives.
- Create dashboards and support executive reporting requirements.
- Define and implement workflow and process designs.
- Work closely with clients to understand requirements and deliver solutions.
- Collaborate with cross-functional teams in a fast-moving environment.
- Troubleshoot issues and help improve data quality and system performance.
Basic Qualifications
- Bachelor's degree or foreign equivalent in Statistics, Computer Science, Mathematics, or a related field.
- At least 7 years of experience in information technology, including building data pipelines.
- Bilingual (Spanish and English) with strong verbal and written communication skills.
Mandatory Skills
- Minimum 5 years of hands-on experience with Python, SQL, and PL/SQL.
- Minimum 3 years of working experience with a cloud platform (AWS preferred).
- Strong understanding of ETL concepts and data processing.
- Experience working in a fast-paced Agile Scrum environment using JIRA.
Preferred Skills
- Exposure to big data technologies (Hadoop, Spark/PySpark with Python or Scala).
- Experience with scheduling tools such as Airflow (MWAA), Control-M, or Tidal.
- Experience with Amazon Redshift, Snowflake, or Azure Synapse.
- Experience with large-scale data warehouse implementations and ETL techniques.
- Experience with CI/CD platforms such as GitLab or Bitbucket and deployment pipelines.
- Knowledge of Unix shell scripting.
Other Requirements
- Ability to work under tight timelines and manage complex requirements.
- Flexibility to work in CST/EST time zones and participate in handover activities with the India team when needed.
- Self-motivated, proactive, and a strong team player.
Perks and Benefits
Health and Wellness
- Health Insurance
- Life Insurance
- HSA
- Short-Term Disability
Parental Benefits
- Birth Parent or Maternity Leave
- Non-Birth Parent or Paternity Leave
- On-site/Nearby Childcare
Work Flexibility
Office Life and Perks
- Commuter Benefits Program
Vacation and Time Off
- Paid Vacation
- Paid Holidays
- Personal/Sick Days
- Sabbatical
Financial and Retirement
- 401(k)
- Relocation Assistance
Professional Development
- Learning and Development Stipend
Diversity and Inclusion
- Employee Resource Groups (ERG)