AWS Data Engineer
In the role of Data Engineer, you will be responsible for designing, creating, and managing an organization's data architecture, ensuring that data is structured, stored, and retrieved efficiently and securely. You will develop data models, design databases, integrate data from various sources, and establish data governance policies. Additionally, you will select appropriate data management tools and technologies, collaborate with stakeholders to understand data needs, and monitor system performance. Key skills include proficiency in database management systems, data modeling, ETL processes, and cloud platforms, along with strong analytical and communication abilities.
Locations for this position are in Mexico (Mexico City, GDL, and MTY), with hybrid work as per Infosys Mexico's policy, during CST hours.
Basic Qualifications:
- Bachelor's degree or foreign equivalent required from an accredited institution.
- At least 8-10 years of experience in Information Technology.
- Bilingual (Spanish and English) is a must.
- Track record of implementing AWS services in a variety of business environments such as large enterprises and start-ups.
- Ability to understand complex business requirements and render them as prototype systems with quick turnaround time.
- Strong verbal and written communication skills are a must, as well as the ability to work effectively across internal and external organizations and virtual teams.
- Knowledge of infrastructure requirements such as networking, storage, and hardware optimization.
- AWS certification: AWS Certified Solutions Architect, AWS Certified Developer, or AWS Certified Big Data - Specialty.
- Implementation and tuning experience in the big data ecosystem (such as EMR, Hadoop, Spark, R, Hive), databases (such as Oracle, MySQL, PostgreSQL), NoSQL (such as DynamoDB, HBase, MongoDB, Cassandra), data warehousing (such as Redshift, Teradata, Vertica), and data migration and integration.
- Work alongside customers to build data management platforms using EMR, Redshift, Kinesis, Amazon Machine Learning, Amazon Athena, Lake Formation, S3, AWS Glue, DynamoDB, ElastiCache and RDS.
- Solid experience with other AWS services such as EC2, EMR, Redshift, and S3, with streaming services such as Kafka and Kinesis, and with HDFS.
- Strong interpersonal, communication and leadership skills.
- Critical thinker with problem solving skills.
- Self-motivated with a positive attitude and ability to work independently.
- Able to work under tight timelines and deliver complex business requirements.
- Must be able to work flexible hours as needed and be a strong team player.
- Understand customer requirements and render those as architectural models that will operate at large scale and high performance. Where customers have architectures prepared, validate them against non-functional requirements and finalize the build model.
- Conversion of Hive SQL to AWS Glue-based (Spark) SQL (see the first sketch after this list).
- Securing EMR clusters using AWS managed or customer managed KMS keys (see the second sketch after this list).
- Deliver working, high-performance data management solutions, such as CloudFormation templates and reusable artifacts, for implementation by the customer (see the third sketch after this list). Bootstrapping instances with user data scripts is an added advantage.
- Prepare architecture and design briefs that outline the key features and decision points of the application built in Data Lab.
- Work with customers to advise on changes as they put these systems live on AWS.
- Extract best practice knowledge, reference architectures, and patterns from these engagements for sharing with the worldwide AWS solution architect community.
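For context on the Hive-to-Glue item above, here is a minimal sketch of running an existing Hive SQL query as Spark SQL inside an AWS Glue job. The database, table, and S3 path names are hypothetical, and the exact conversion effort depends on the customer's queries and catalog setup.

```python
# Minimal AWS Glue job sketch: run a former Hive SQL query as Spark SQL.
# Assumes the job is configured to use the Glue Data Catalog as its Hive metastore;
# database/table/bucket names below are placeholders.
import sys
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Many Hive queries run unchanged under Spark SQL against catalog tables.
result = spark.sql("""
    SELECT customer_id, SUM(amount) AS total_amount
    FROM sales_db.orders
    GROUP BY customer_id
""")

# Write the result set back to S3 as Parquet (hypothetical location).
result.write.mode("overwrite").parquet("s3://example-bucket/curated/orders_by_customer/")

job.commit()
```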
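For the EMR security item, a minimal sketch of creating an EMR security configuration that encrypts data at rest with a customer managed KMS key via boto3. The key ARN, region, and configuration name are placeholders, not values from this posting.

```python
# Create an EMR security configuration enforcing at-rest encryption with a KMS key.
import json
import boto3

emr = boto3.client("emr", region_name="us-east-1")

security_configuration = {
    "EncryptionConfiguration": {
        "EnableAtRestEncryption": True,
        "EnableInTransitEncryption": False,
        "AtRestEncryptionConfiguration": {
            # S3 (EMRFS) objects encrypted with SSE-KMS using the customer managed key.
            "S3EncryptionConfiguration": {
                "EncryptionMode": "SSE-KMS",
                "AwsKmsKey": "arn:aws:kms:us-east-1:111122223333:key/example-key-id",
            },
            # Local disks on cluster nodes encrypted with the same key.
            "LocalDiskEncryptionConfiguration": {
                "EncryptionKeyProviderType": "AwsKms",
                "AwsKmsKey": "arn:aws:kms:us-east-1:111122223333:key/example-key-id",
            },
        },
    }
}

emr.create_security_configuration(
    Name="emr-at-rest-kms",
    SecurityConfiguration=json.dumps(security_configuration),
)
# Reference this configuration by name when launching the cluster with run_job_flow.
```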
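For the CloudFormation and user data item, a minimal sketch of deploying a small, reusable template with boto3, where an instance is bootstrapped through a user data script. The AMI ID, bucket, stack name, and bootstrap commands are illustrative only, not the team's actual artifacts.

```python
# Deploy an illustrative CloudFormation stack whose EC2 instance bootstraps via user data.
import boto3

TEMPLATE = """
AWSTemplateFormatVersion: '2010-09-09'
Description: Illustrative ingestion host bootstrapped via user data.
Resources:
  IngestionInstance:
    Type: AWS::EC2::Instance
    Properties:
      ImageId: ami-0abcdef1234567890   # placeholder AMI
      InstanceType: t3.micro
      UserData:
        Fn::Base64: |
          #!/bin/bash
          yum install -y python3
          aws s3 cp s3://example-bucket/bootstrap/ingest.py /opt/ingest.py
"""

cloudformation = boto3.client("cloudformation", region_name="us-east-1")
cloudformation.create_stack(
    StackName="data-platform-ingestion-demo",
    TemplateBody=TEMPLATE,
)
```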