Are you passionate about simplifying complex problems? Do you like finding patterns and inventing new ways to push the boundaries of what's possible? Are you interested in building high-performance, globally scalable systems that support Amazon's current and future growth? If so, Amazon Finance Technology (FinTech) is for you!
Amazon FinTech is an organization where people, technology, and innovation come together to build products and solve problems for Amazon. The technology solutions and services we build enable Amazon's new business growth, provide operational efficiency through automation, ensure compliance with the law, and support analysis of our financial data. Through our products, we aim to give Amazon an effective advantage in running its business and to provide insights for our customers using state-of-the-art technologies.
FinTech is seeking a Data Engineer for the Indirect Tax Compliance and Audit team. Our team is responsible for automating complex business processes that currently require hundreds of thousands of manual hours. We innovate by processing trillions of records in near real time and providing self-service capabilities that let our customers perform data transformation, computation, and reconciliation. We build controls and auditing capabilities so our users can perform their tasks with a high degree of confidence and efficiency. The team is committed to building the next-generation big data platform to support Amazon's rapidly growing and dynamic businesses, and to using it to deliver Business Intelligence (BI) applications that have an immediate influence on day-to-day decision making. Amazon has a culture of data-driven decision-making and demands data that is timely, accurate, and actionable. Our platform serves Amazon's tax functions across the globe.
As a Data Engineer, you should be an expert in data warehousing technical components (e.g., data modeling, ETL, and reporting), infrastructure (e.g., hardware and software), and their integration. You should have a deep understanding of enterprise-level data warehouse architectures spanning multiple platforms (RDBMS, columnar, cloud), and be an expert in the design, creation, management, and business use of large datasets. You should have strong business and communication skills: working with business owners to develop and define key business questions, then building datasets that answer those questions. You should be able to build efficient, flexible, extensible, and scalable ETL and reporting solutions. You should be enthusiastic about learning new technologies and able to apply them to deliver new functionality to users or to scale the existing platform. Excellent written and verbal communication skills are required, as you will work closely with diverse teams. Strong analytical skills are a plus. Above all, you should be passionate about working with huge datasets and love bringing data together to answer business questions and drive change.
Our ideal candidate thrives in a fast-paced environment, relishes working with large transactional volumes and big data, enjoys the challenge of highly complex business contexts (often being defined in real time), and, above all, is passionate about data and analytics. In this role you will be part of a team of engineers creating the world's largest financial data warehouses and BI tools for Amazon's expanding global footprint.
• Design, implement, and support a platform providing secure access to large datasets
• Interface with Tax, Finance, and Accounting customers to gather requirements and deliver end-to-end BI solutions
• Model data and metadata to support ad-hoc and pre-built reporting
• Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions
• Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation
• Tune application and query performance using profiling tools and SQL
• Analyze and solve problems at their root, stepping back to understand the broader context
• Learn and understand a broad range of Amazon's data resources and know when, how, and which to use
• Keep up to date with advances in big data technologies and run pilots to design data architectures that scale with increasing data volume using AWS services
• Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for datasets
• Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment
Amazon is committed to a diverse and inclusive workplace. Amazon is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status. For individuals with disabilities who would like to request an accommodation, please visit https://www.amazon.jobs/en/disability/us
• 3+ years of experience as a Data Engineer or in a similar role
• Experience with data modeling, data warehousing, and building ETL pipelines
• Experience in SQL
• Bachelor's Degree in Computer Science or equivalent
• 5+ years of work experience with ETL, Data Modeling, and Data Architecture
• 3+ years of experience using big data technologies (MapReduce, Hadoop, Hive, Spark, Presto, Parquet, EMR, etc.)
• Excellent knowledge of SQL and Linux OS
• Proficiency in at least one modern programming language such as Java, Scala, or Python
• Excellent understanding of Software Development Life Cycle (SDLC) and Agile software development with emphasis on Business Intelligence (BI) practices
• Master's Degree in Information Systems or equivalent
• Knowledge of data management fundamentals and data storage principles
• Knowledge of distributed systems as they pertain to data storage and distributed cloud computing
• Experience working with AWS Big Data Technologies (EMR, Redshift, S3)
• Experience with Business Intelligence solutions (e.g., Tableau, Business Objects, Cognos)
• Strong problem-solving skills and ability to prioritize conflicting requirements
• Excellent written and verbal communication skills and ability to succinctly summarize key findings
• Strong organizational and multitasking skills with ability to balance multiple priorities
• Experience providing technical leadership and mentoring for other engineers on data engineering best practices