Data Engineer (Snowflake)
Infosys Brazil is looking for professionals with the profile of Data Engineer (Snowflake) to join our team.
Required qualifications:
- Strong expertise in designing, supporting, and developing batch and streaming data pipelines using Azure Cloud Services and Snowflake.
- Advanced proficiency in Python, PySpark, and Snowflake, with the ability to work on complex data engineering solutions.
- Solid understanding of ETL/ELT processes, including hands-on work with Databricks and Azure Data Factory, integrating with APIs, Event Hubs, and additional data sources.
- Knowledge of Azure Cloud architecture, including storage, security, and resource optimization best practices.
- Strong focus on system maintenance, with the ability to proactively identify and quickly resolve pipeline or job-related issues.
- Experience with Git-based version control (GitHub, GitLab, Bitbucket), applying branching strategies, pull requests, and code reviews.
- Excellent communication skills in English for clear interactions with users and stakeholders.
- Understanding of Agile methodologies and collaboration in an agile delivery environment.
Desired qualifications (nice to have):
- Professional certifications in Snowflake or Databricks.
- Familiarity with data quality frameworks and monitoring strategies.
- Ability to work independently with high levels of ownership and proactivity.
- Strong analytical and problem-solving capabilities, combined with adaptability and teamwork.
Main activities and responsibilities:
- Monitor and respond to alert notifications regarding execution failures across tools such as Databricks, Azure Data Factory, and internal data-quality reports.
- Review monitoring dashboards to identify issues and create the corresponding Product Backlog Items (PBIs) or user stories for corrective actions.
- Engage proactively with users through team communication channels, capturing operational issues and performing required fixes.
- Work extensively with Azure Cloud Services, Azure Data Factory, Databricks, Python, Spark, and Snowflake, performing investigations related to missing data, pipeline failures, and schedule deviations.
- Execute break-in requests by conducting issue investigations, root-cause analyses, and remediation for pipelines that fail or behave unexpectedly.
Additional information:
Type of work: CLT contract, flexible hours.

Benefits package: Transport vouchers, meal vouchers, childcare assistance, life insurance, funeral assistance, medical and dental insurance.
Career: work with national and international clients and projects, with the possibility of international career mobility.
Client-provided location(s): São Paulo, Brazil
Job ID: Infosys-145784BR
Employment Type: OTHER
Posted: 2026-05-06T18:44:46
Perks and Benefits
Health and Wellness
- Health Insurance
- Life Insurance
- HSA
- Short-Term Disability
Parental Benefits
- Birth Parent or Maternity Leave
- Non-Birth Parent or Paternity Leave
- On-site/Nearby Childcare
Office Life and Perks
- Commuter Benefits Program
Vacation and Time Off
- Paid Vacation
- Paid Holidays
- Personal/Sick Days
- Sabbatical
Financial and Retirement
- 401(K)
- Relocation Assistance
Professional Development
- Learning and Development Stipend
Diversity and Inclusion
- Employee Resource Groups (ERG)