As an ETL / Hadoop developer, the candidate will design and develop Informatica ETL and Hadoop applications.

Responsibilities:
- Undertake end-to-end project delivery (from inception to post-implementation support), including review and finalization of business requirements, creation of functional specifications and/or system designs, and ensuring that the end solution meets business needs and expectations.
- Develop new transformation processes to load data from source to target, or performance-tune existing ETL code (mappings, sessions) and the Hadoop platform.
- Analyze existing designs and interfaces and apply design modifications or enhancements.
- Code and document data processing scripts and stored procedures.
- Provide business insights and analysis findings for ad-hoc data requests.
- Unit-test software components and complete solutions (including debugging and troubleshooting) and prepare migration documentation.
- Provide reporting-line transparency through periodic updates on project or task status.

Qualifications:
- Bachelor's / Master's degree in engineering, preferably Computer Science / Engineering.
- 8+ years of experience with the technical analysis, design, development, and implementation of data warehousing / data lake solutions.
- 8+ years of strong SQL programming and stored procedure development skills.
- 8+ years of experience developing in Informatica data management in the cloud, or Informatica PowerCenter and DataStage.
- 6+ years with Python, Control-M orchestration, Bitbucket, Jenkins, and CI/CD.
- 6+ years with Hive/Impala/Spark and Unix.
- 6+ years of Python script development.
- 6+ years of relational database design in DB2, Hadoop, Snowflake, and Teradata.
- 6+ years of strong UNIX shell scripting experience to support data warehousing solutions.
- 4+ years as an ETL developer in Snowflake.
- Some experience with AWS Glue and S3.
- Process oriented, focused on standardization, streamlining, and implementation of a best-practices delivery approach.
Desired skills:
- Excellent problem-solving and analytical skills.
- Excellent verbal and written communication skills.
- Experience in optimizing large data loads.
- A good collaborator.
- Exposure to an Agile development environment would be a plus.
- Knowledge of the TWS scheduler would be an added advantage.
- Strong understanding of the data warehousing domain.
- Ability to architect an ETL solution and data conversion strategy.
- Good understanding of dimensional modelling.

Our values - putting clients first, doing the right thing, leading with exceptional ideas, committing to diversity and inclusion, and giving back - aren't just beliefs; they guide the decisions we make every day to do what's best for our clients, communities, and more than 80,000 employees in 1,200 offices across 42 countries. Our teams are relentless collaborators and creative thinkers, fueled by their diverse backgrounds and experiences. We are proud to support our employees and their families at every point along their work-life journey, offering some of the most attractive and comprehensive employee benefits and perks in the industry. There's also ample opportunity to move about the business for those who show passion and grit in their work. Consequently, our recruiting efforts reflect our desire to attract and retain the best and brightest from all talent pools. We want to be the first choice for prospective employees.

It is the policy of the Firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, sex stereotype, gender, gender identity or expression, transgender status, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy, veteran or military service status, genetic information, or any other characteristic protected by law.