Are you passionate about data? Does the prospect of dealing with massive volumes of data excite you? Do you want to build data engineering solutions that process billions of records a day in a scalable fashion using AWS technologies? Do you want to create the next-generation tools for intuitive data access? If so, Amazon Finance Technology (FinTech) is for you!
FinTech is seeking a Data Engineer to join the team shaping the future of the finance data platform. The team is committed to building the next-generation big data platform, one of the world's largest finance data warehouses, to support Amazon's rapidly growing and dynamic businesses, and to deliver BI applications that have an immediate influence on day-to-day decision making. Amazon has a culture of data-driven decision making and demands data that is timely, accurate, and actionable. Our platform serves Amazon's finance, tax, and accounting functions across the globe.
As a Data Engineer, you should be an expert in data warehousing technical components (e.g., data modeling, ETL, and reporting), infrastructure (e.g., hardware and software), and their integration. You should have a deep understanding of the architecture of enterprise-level data warehouse solutions across multiple platforms (RDBMS, columnar, cloud), and be an expert in the design, creation, management, and business use of large datasets. You are expected to build efficient, flexible, extensible, and scalable ETL and reporting solutions. You should be enthusiastic about learning new technologies and able to apply them to deliver new functionality to users or to scale the existing platform. Our ideal candidate thrives in a fast-paced environment, relishes working with large transactional volumes and big data, enjoys the challenge of highly complex business contexts (often being defined in real time), and, above all, is passionate about data and analytics.
• Design, implement, and support a platform providing secured access to large datasets.
• Interface with tax, finance, and accounting customers to gather requirements and deliver complete BI solutions.
• Architect and develop end-to-end scalable data applications and data pipelines.
• Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.
• Develop ETL pipelines that link various datasets to serve analysis objectives in the most efficient way.
• Analyze and solve problems at their root, stepping back to understand the broader context.
• Learn and understand a broad range of Amazon's data resources, and know when, how, and which to use (and which not to).
• Keep up to date with advances in big data technologies and run pilots to design data architecture that scales with growing data volumes on AWS.
• 5+ years of experience as a Data Engineer or in a similar role
• Experience with data modeling, data warehousing, and building ETL pipelines
• Experience in SQL
• Experience building data pipelines and applications to stream and process datasets at low latency.
• Sound knowledge of distributed systems and data architecture (e.g., the Lambda architecture): able to design and implement batch and stream data processing pipelines, and to optimize the distribution, partitioning, and MPP processing of large data structures.
• Knowledge of Engineering and Operational Excellence using standard methodologies.
• Master's degree in Information Systems or a related field.
• Expertise in designing systems and workflows for handling big data volumes.
• Knowledge of data management fundamentals and data storage principles
• Strong problem-solving skills and ability to prioritize conflicting requirements.
• Excellent written and verbal communication skills and ability to succinctly summarize key findings.
• Experience working with AWS Big Data Technologies (EMR, Redshift, S3)
• Strong organizational and multitasking skills with ability to balance multiple priorities.
• Experience providing technical leadership and mentoring other engineers on data engineering best practices.
• An ability to work in a fast-paced environment where continuous innovation is occurring and ambiguity is the norm.