Are you ready to pursue some of the hardest scalability, performance, and distributed computing challenges in AWS? Our team's vision is to be the world's authoritative provider of AWS computing insight, where customers can understand, control, and optimize the cost and usage of AWS products. Because we sit at the nexus of all AWS services and interact directly with end customers, we build relationships with teams across AWS to ensure that we offer a secure and reliable customer experience. We take inputs from these services and merge them with millions of events per second to produce actionable insight for our customers via web, mobile, tablet, and APIs.
Providing a scalable platform to support AWS's expanding business is a complex architectural challenge, and accurate cost and usage information is a critical piece. Enterprise-level customers make large dollar-value decisions based on the timeliness, accuracy, and detail of the data provided by our products.
Applied and Data Scientists on this team have end-to-end range and capabilities. They work directly with business owners to understand how they use data to drive their business, and they design modeling frameworks to dive deep into raw sources of information and get the most out of the data they have. They work directly with engineers to build automated pipelines, production-scale information systems, and models, and they build automated tools that allow their results to be shared with the business at scale. They align with business owners to continuously track their work and ensure maximum impact from their projects, and they monitor the performance of their work in production to evaluate whether improvements are needed.
AWS has the most services, and more features within those services, than any other cloud provider: from infrastructure technologies like compute, storage, and databases to emerging technologies such as machine learning and artificial intelligence, data lakes and analytics, and the Internet of Things. Whether it's identity features such as access management and sign-on, cryptography, the console, builder and developer tools, or even projects like automating all of our contractual billing systems, AWS Platform is always innovating with the customer in mind. The AWS Platform team sustains over 750 million transactions per second.
Our team also puts a high value on work-life balance. Striking a healthy balance between your personal and professional life is crucial to your happiness and success here, which is why we aren't focused on how many hours you spend at work or online. Instead, we're happy to offer a flexible schedule so you can have a more productive and well-balanced life, both in and outside of work.
We have a formal mentor search application that lets you find a mentor who works best for you based on location, job family, job level, and more. Your manager can also help you find a mentor or two, because two is better than one. In addition to formal mentors, we work and train together so that we are always learning from one another, and we celebrate and support the career progression of our team members.
Here at AWS, we embrace our differences. We are committed to furthering our culture of inclusion. We have ten employee-led affinity groups, reaching 40,000 employees in over 190 chapters globally. We have innovative benefit offerings, and we host annual and ongoing learning experiences, including our Conversations on Race and Ethnicity (CORE) and AmazeCon (gender diversity) conferences. Amazon's culture of inclusion is reinforced within our 14 Leadership Principles, which remind team members to seek diverse perspectives, learn and be curious, and earn trust.
Learn more about Amazon on our Day 1 Blog: https://blog.aboutamazon.com/
- 3+ years of experience in applied data science, analysis, or engineering
- 1+ years of experience applying statistics, data science, or machine learning
- 1+ years of scripting experience in Python, R, or other scripting languages
- 1+ years of SQL experience
- 1+ years of experience in data visualization using Tableau, R Shiny, other off-the-shelf products, or scripting directly
- Bachelor's degree in Data Science, Computer Science, Information Systems, Data Analytics, or a related scientific, technical, or engineering field
- Experience in ETL management and data pipelines
- Experience leading or mentoring data analytics teams
- Proficiency with Scala, Spark, and Hadoop
- Experience documenting models for technical and business leaders
- Expert-level knowledge of SQL
- Working knowledge of the AWS tech stack; Glue, Redshift, EMR, S3, EC2, and Lambda will be used regularly in this role
- Experience working collaboratively with data engineers and business intelligence engineers
- PhD or Master's degree in a scientific, technical, or engineering field