CTPS Finance Analytics is looking for a Data Engineer (DE) to play a significant role in building its large-scale, high-volume, high-performance data integration and delivery services. These data solutions will primarily be used for periodic reporting and will drive business decision-making, while dealing efficiently with the massive scale of data available through our Data Warehouse and our software systems. You will be responsible for designing and implementing solutions using third-party and in-house reporting tools, modeling metadata, building reports and dashboards, and administering the platform software. You are expected to build efficient, flexible, extensible, and scalable data models, ETL designs, and data integration services, and to support and manage the growth of these data solutions.
You must be a self-starter and be able to learn on the go. Excellent written and verbal communication skills are required as you will work very closely with diverse teams.
As a Data Engineer on the CTPS Finance Analytics team, you will work with one of the world's largest cloud-based data lakes. You should be skilled in architecting enterprise data warehouse solutions across multiple platforms (EMR, RDBMS, Columnar, Cloud) and have extensive experience in the design, creation, management, and business use of extremely large datasets. You should have excellent business and communication skills, enabling you to work with business owners to develop and define key business questions and to build datasets that answer those questions. Above all, you should be passionate about working with huge datasets and love bringing data together to answer business questions and drive change.
• 3+ years of experience as a Data Engineer or in a similar role
• Experience with data modeling, data warehousing, and building ETL pipelines
• Experience in SQL
• Bachelor's degree in an engineering or technical field
• Solid understanding of relational database concepts
• Strong knowledge of data warehousing methodologies and data modeling concepts; hands-on modeling experience is highly desired
• Experience performing performance tuning at both the database and ETL levels
• Knowledge of professional software engineering practices & best practices for the full software development lifecycle, including coding standards, code reviews, source control management, build processes, testing, and operations
• Coding proficiency in at least one modern programming language (Python, Ruby, Java, etc.)
• 5+ years of experience with, and detailed knowledge of, data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools
• Industry experience as a Data Engineer or in a related specialty (e.g., Software Engineer, Business Intelligence Engineer, Data Scientist) with a track record of manipulating, processing, and extracting value from large datasets
• Experience building and operating highly available, distributed systems for the extraction, ingestion, and processing of large data sets
• Experience building data products incrementally and integrating and managing datasets from multiple sources
• Query performance tuning skills using Unix profiling tools and SQL
• Experience leading large-scale data warehousing and analytics projects, including with AWS technologies such as Redshift, S3, EC2, and Data Pipeline, and other big data technologies
• Experience providing technical leadership and mentoring other engineers on best practices in the data engineering space
• Experience using Linux/UNIX to process large data sets
• Experience with AWS
Amazon is an Equal Opportunity-Affirmative Action Employer: Minority / Female / Disability / Veteran / Gender Identity / Sexual Orientation.