Are you a Data Analytics specialist? Do you have Data Warehousing, Hadoop/Data Lake experience? Do you like to solve the most complex and high scale (billions + records) data challenges in the world today? Do you like to work on-site in a variety of business environments, leading teams through high impact projects that use the newest data analytic technologies? Would you like a career path that enables you to progress with the rapid adoption of cloud computing?
At Amazon Web Services (AWS), we're hiring highly technical cloud computing architects to collaborate with our customers and partners on key engagements. Our consultants will develop, deliver, and implement AI, IoT, and data analytics projects that help our customers leverage their data to develop business insights. These professional services engagements will focus on customer solutions such as machine learning, IoT, batch/real-time data processing, and data and business intelligence.
This is a customer-facing role. You will be required to travel to client locations and deliver professional services when needed.
- Expertise - Collaborate with AWS field sales, pre-sales, training and support teams to help partners and customers learn and use AWS services such as Athena, Glue, Lambda, S3, DynamoDB and other NoSQL databases, Relational Database Service (RDS), Amazon EMR and Amazon Redshift.
- Solutions - Deliver on-site technical engagements with partners and customers. This includes participating in pre-sales on-site visits, understanding customer requirements, and creating packaged Data & Analytics service offerings.
- Delivery - Engagements include short on-site projects proving the use of AWS services to support new distributed computing solutions that often span private cloud and public cloud services. Engagements will include migration of existing applications and development of new applications using AWS cloud services.
- Insights - Work with AWS engineering and support teams to convey partner and customer needs and feedback as input to technology roadmaps. Share real world implementations and recommend new capabilities that would simplify adoption and drive greater value from use of AWS cloud services.
- Innovate - Engage with the customer's business and technology stakeholders to create a compelling vision of a data-driven enterprise in their environment.
Here at AWS, we embrace our differences. We are committed to furthering our culture of inclusion. We have thirteen employee-led affinity groups, reaching 85,000 employees in over 190 chapters globally. We have innovative benefit offerings, and host annual and ongoing learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences. Amazon's culture of inclusion is reinforced within our 16 Leadership Principles, which remind team members to seek diverse perspectives, learn and be curious, and earn trust.
Our team puts a high value on work-life harmony. Striking a healthy balance between your personal and professional life is crucial to your happiness and success here. We are a customer-obsessed organization; leaders start with the customer and work backwards. They work vigorously to earn and keep customer trust. As such, this is a customer-facing role in a hybrid delivery model. Project engagements include remote delivery methods and on-site engagements that will include travel to customer locations as needed.
Mentorship & Career Growth
Our team is dedicated to supporting new members. We have a broad mix of experience levels and tenures, and we're building an environment that celebrates knowledge sharing and mentorship. We care about your career growth and strive to assign projects based on what will help each team member develop into a better-rounded professional and enable them to take on more complex tasks in the future.
- Bachelor's degree, or equivalent experience, in Computer Science, Engineering, Mathematics or a related field
- 5+ years of experience in IT platform implementation in a technical and analytical role
- 3+ years of experience in Data Lake/Hadoop platform implementation
- 3+ years of hands-on experience implementing and performance-tuning Hadoop/Spark deployments
- Experience with Apache Hadoop and the Hadoop ecosystem
- Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr, Avro)
- Experience with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto)
- Experience developing software in one or more programming languages (Java, Python, etc.)
- Master's or PhD in Computer Science, Physics, Engineering or Math
- Hands-on experience leading large-scale global data warehousing and analytics projects
- Ability to think strategically about business, product, and technical challenges in an enterprise environment
- Ability to collaborate effectively across organizations
- Understanding of database and analytical technologies in the industry including MPP and NoSQL databases, Data Warehouse design, BI reporting and Dashboard development
- Demonstrated industry expertise in the fields of databases, data warehousing, or data science
- Experience implementing AWS services in a variety of distributed computing and enterprise environments
- Customer-facing skills to represent AWS well within the customer's environment and drive discussions with senior personnel regarding trade-offs, best practices, project management and risk mitigation
- Desire and ability to interact with different levels of the organization from development to C-Level executives