We are seeking a Senior Data Engineer to join our team.
This role involves a blend of engineering and analytical responsibilities. You will develop and maintain data infrastructure while also analyzing cost data, identifying trends, and working closely with teams to optimize cloud spend. The position focuses heavily on AWS and requires a strong understanding of AWS services and cost structures. You will leverage tools like Python, Databricks, Snowflake, Airflow, and Looker to deliver impactful insights and solutions that drive cost optimization and decision-making.
Responsibilities
- Design, build, and maintain scalable ETL pipelines to process and transform large volumes of cloud cost and usage data
- Integrate data from multiple sources, including AWS, into centralized data lakes or warehouses like Snowflake
- Develop and maintain data models to support cost analysis and reporting needs
- Optimize query performance and storage efficiency for large-scale datasets
- Automate recurring data processing tasks and implement robust monitoring for data pipelines
- Ensure data accuracy and reliability through validation processes
- Analyze cloud cost data to identify trends, anomalies, and optimization opportunities
- Work closely with teams to investigate spending changes and resolve cost anomalies
- Collaborate with stakeholders to understand cost drivers and provide actionable insights
- Support teams in building dashboards and visualizations to track key cost metrics
- Create reports and presentations to communicate findings and recommendations to leadership
- Partner with teams to develop strategies for cost reduction and operational efficiency
Requirements
- Bachelor's degree in Computer Science, Data Engineering, Data Analytics, or a related field
- 3+ years of experience in a data engineering, data analysis, or hybrid role
- Knowledge of AWS services (e.g., EC2, S3, RDS, Lambda) and their cost structures
- Proficiency in SQL and experience with relational databases like Snowflake or Redshift
- Familiarity with Databricks and Spark for large-scale data processing
- Hands-on experience with ETL tools and frameworks (e.g., Databricks, Apache Airflow)
- Programming experience in Python for data analysis and pipeline development
- Experience with BI tools like Looker or Tableau for creating dashboards and visualizations
- Excellent problem-solving skills and the ability to interpret and analyze large datasets
- Experience with Kubernetes, Docker, or other containerization technologies
- Understanding of cloud cost management tools and strategies
- Strong communication skills for presenting insights to stakeholders and leadership
We Offer
- Career plan and real growth opportunities
- Unlimited access to LinkedIn learning solutions
- International Mobility Plan within 25 countries
- Constant training, mentoring, online corporate courses, eLearning and more
- English classes with a certified teacher
- Support for employee's initiatives (Algorithms club, toastmasters, agile club and more)
- Enjoyable working environment (Gaming room, napping area, amenities, events, sport teams and more)
- Flexible work schedule and dress code
- Collaborate in a multicultural environment and share best practices from around the globe
- Hired directly by EPAM & 100% under payroll
- Legally mandated benefits (IMSS, INFONAVIT, 25% vacation bonus)
- Life insurance and major medical expenses insurance with dental & visual coverage (for the employee and direct family members)
- 13% employee savings fund, capped at the legal limit
- Grocery coupons
- 30-day December bonus
- Employee Stock Purchase Plan
- 12 vacation days plus 4 floating days
- Official Mexican holidays, plus 5 extra holidays (Maundy Thursday and Good Friday, November 2nd, December 24th & 31st)
- Monthly non-taxable allowance for electricity and internet bills
By applying to our role, you are agreeing that your personal data may be used as set out in EPAM's Privacy Notice and Policy.