Big Data Integrations Specialist - PVH Corp.
Design Your Future at PVH
This position, as a member of the Business Intelligence & Database team, will assist in the development of our Big Data Analytics Platform. The Big Data Integration Specialist is responsible for designing, developing, and deploying data integration solutions on both RDBMS and Big Data (Hadoop) platforms. The position will create and implement business intelligence solutions as well as extract, transform, and load (ETL) solutions using integration tools, programming, performance tuning, and data modeling. The ideal candidate will possess the skills to ingest data into the Big Data platform and prepare it for consumption and analysis (Hadoop, MapReduce, Hive, HBase).
PRIMARY RESPONSIBILITIES/ACCOUNTABILITIES OF THE JOB:
- Learn the area's data flow and how it affects surrounding systems and operational areas.
- Architect, design, construct, test, tune, deploy, and support Data Integration solutions on Hadoop (MapR) and MPP (Spark) platforms.
- Work closely with the Business Intelligence team, Data Engineers, and Data Scientists to achieve company business objectives.
- Collaborate with other technology teams and architects to define and develop solutions.
- Research and experiment with emerging Data Integration technologies and tools related to Big Data.
- Work with the team to ensure that disciplined software development processes, standards, and error recovery procedures are established and followed, ensuring a high degree of data quality.
- Assist users/analysts with development in MapR and Spark.
- Develop, write, and implement processing requirements; conduct post-implementation reviews and performance tuning.
- Facilitate and/or create new procedures and processes that support advancing technologies or capabilities
- Design and implement ETL solutions using Informatica Big Data Management (BDM).
- Create logic, system, and program flows for complex systems, including interfaces and metadata
- Write and execute unit test plans. Track and resolve any processing issues.
- Implement and maintain operational and disaster-recovery procedures.
- Participate in the review of code and/or systems for proper design standards, content and functionality.
- Participate in all aspects of the Systems Development Life Cycle
- Analyze files from internal, external, and third-party systems and map data from one system to another.
- Adhere to established source control versioning policies and procedures
- Meet timeliness and accuracy goals.
- Communicate status of work assignments to stakeholders and management.
- Responsible for technical and production support documentation in accordance with department standards and industry best practices.
- Maintain current knowledge on new developments in technology-related industries
- Participate in corporate quality and data governance programs
QUALIFICATIONS & EXPERIENCE:
- 6+ years of experience building and managing complex Big Data Integration solutions in the cloud.
- 6+ years of experience with distributed, highly-scalable, multi-node environments.
- MS SQL Certification or other certification in current programming languages a plus
- Bachelor's Degree in Information Technology or related field preferred
Required Professional Competencies:
- Advanced knowledge of business intelligence, programming, and data analysis software
- Intermediate knowledge of Microsoft SQL databases.
- Intermediate proficiency in T-SQL, NZ-SQL, PostgreSQL, data tuning, enterprise data modeling and schema change management.
- Ingestion of data into Hadoop and proficiency with common Hadoop tools such as NiFi, Hive, Pig, Oozie, HBase, Flume, Sqoop, YARN, MapReduce, Ambari, Spark, Java, and Python.
- Proficiency in Big Data Integration tools such as Informatica Big Data Management and/or Talend.
- Strong object-oriented design and analysis skills
- Experience consuming, organizing and analyzing JSON and XML messages as data.
Preferred Job Skills:
- Advanced knowledge of Data Management including Data Integration and Data Quality
- Advanced proficiency in Informatica Big Data Management, Talend Open Studio tools.
- Experience in Informatica Big Data Quality, Big Data Masking, Enterprise Information Catalog a plus
- Intermediate knowledge in Python and/or R scripting
- Flair for data, schemas, and data modeling, and for bringing efficiency to the Big Data life cycle.
- Minimum 1-2 years of experience with cloud computing, Google Cloud preferred.
- Proficiency with agile development practices
- Experience collecting and storing data from RESTful APIs.
It is the policy of PVH Corp. to ensure equal employment opportunities to all qualified persons without regard to race, gender, religion, age, national origin, citizenship status, disability, qualified veteran status, marital status, or sexual orientation.