Sr. Manager - EDW

Associate Job Description         

Position Description:

Global Custom Commerce is looking for a Senior Manager of EDW. The Senior Manager of EDW is a hands-on leader equally comfortable working side-by-side with team members to deliver solutions and discussing strategy with executive leadership. This person provides thought leadership around the EDW, data architecture and integration, and will guide the team as we move toward new and emerging data stores to support real-time analytics and machine learning.

Why work here?

Our entrepreneurial roots and maverick mentality, coupled with the resources and backing of the #1 home improvement retailer in the world, The Home Depot, make this a unique opportunity for you to be a transformative retail disrupter. Plus, GCC is the world’s largest online window covering company, and we’ve got a demonstrably awesome 20-year track record. From our open-floor office to our open-door ethos, our culture is rooted in improving, evolving, and having fun (we’re pretty serious about cake, cook-offs, ping pong, meaningful work and exciting projects). Most importantly, our team members are always inspired, engaged, and ready for growth.

That means you’ll have the resources and the runway to create truly magical, out-of-the-box work. Moreover, you will play an important role in leveraging our culture, people, systems, processes, and technology — ultimately to provide incredible customer and associate experiences, while growing business for GCC and The Home Depot. This is your chance to be part of something big, in a small start-up environment.

We’re ranked as one of The Top 5 Workplaces in Texas and have consistently won the following awards: The Best Place to Work in Houston (Houston Business Journal), Houston’s Top Workplaces (Houston Chronicle) and Houston’s Best and Brightest.

Our Core Values


These aren’t just buried somewhere in an employee manual. We live and breathe them. They’re on the walls and live in our hearts. They come up constantly in conversations and actions. They govern the decisions of the newest hire all the way up to our CEO. We started with four core values crafted at GCC and we embrace eight additional core values from The Home Depot that we live every day:




  1.    Improve continuously
  2.    Experiment without fear
  3.    Be yourself and speak up
  4.    Enjoy the ride
  5.    Entrepreneurial Spirit
  6.    Take care of our people
  7.    Respect for all people
  8.    Doing the right thing
  9.    Building strong relationships
  10.   Giving back
  11.   Excellent customer service
  12.   Creating shareholder value



Duties and Responsibilities:
  • Manage a small team of EDW engineers and work hands-on with your team to deliver solutions
  • Implement a real-time streaming data ingestion and processing pipeline using Google Dataflow (Apache Beam) or related technology
  • Interface with business intelligence analysts and others in IT (e.g., data engineers, architects, WebOps) in frequent whiteboard sessions to discuss the design, implementation, and testing of data pipelines
  • Maintain data architecture standards and ETL/ELT best practices consistent with a column-oriented data store in an analytic use case
  • Architect, plan, and implement a highly available, scalable, and adaptable end-to-end data environment that addresses longer-term business needs.
  • Design, develop, and re-engineer processes to improve performance, organizational effectiveness, and system, quality, and service outcomes
  • Guide and mentor a team of 1-3 engineers, augmented by contractors when needed
  • Keep company leaders informed of important developments, potential problems, and related information necessary for effective management; coordinate and communicate plans and activities with others to ensure a coordinated work effort and team approach

Experience and Qualifications:
  • Experience building real-time streaming data ingestion and processing pipelines using Apache Beam (running on Google Dataflow, Apache Apex, Flink, or Spark) or Kafka in an analytics or data science use case
  • Experience with data processing tools (e.g. Hadoop, Spark, Dataflow, etc.)
  • Experience building ETL/ELT pipelines
  • Experience with column-oriented databases (e.g., Redshift, BigQuery, Vertica)
  • Ability to go from whiteboard discussion to code
  • Ability to effectively communicate with technical and non-technical audiences
  • Strong programming ability
  • Success in a highly dynamic environment and ability to shift priorities with agility
  • Ability to act independently with minimal supervision
  • Willingness to explore and implement new ideas and technologies
  • 6+ years of experience working directly with subject matter experts in both business and technology domains
  • 6+ years of experience with a modern programming language (Java or Python preferred)
  • 2+ years of experience with Apache Beam executed on Apex, Flink, Spark, or Google Dataflow
  • 2+ years of experience leading or managing a small team
  • Master’s degree in Computer Science, Management Information Systems, Statistics, or a related field, or equivalent work experience
  • Strong knowledge of end-to-end data warehouse development life cycle (data integration, logical and physical modeling, and data delivery) supporting enterprise analytics and BI solutions.
  • Hands-on knowledge of enterprise repository tools, data modeling tools, data mapping tools, and data profiling tools.
  • Ability to manage data and metadata migration
  • Understanding of web services (SOAP, XML, REST, UDDI, WSDL) and of integrating them with our existing data environment
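
Much of the streaming work described above boils down to windowed aggregation over timestamped events. As a rough conceptual illustration only (plain Python, not actual Apache Beam code; the function and data below are hypothetical examples, not from any GCC system), a tumbling-window count can be sketched as:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Count events per key within fixed (tumbling) time windows.

    A pure-Python sketch of the windowed aggregation a Beam/Dataflow
    streaming pipeline would perform; names are illustrative only.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Assign each event to the window starting at the nearest
        # lower multiple of window_secs.
        window_start = ts - (ts % window_secs)
        counts[(window_start, key)] += 1
    return dict(counts)

# Four events spanning two 10-second windows.
events = [(0, "view"), (3, "click"), (7, "view"), (12, "view")]
print(tumbling_window_counts(events, 10))
# {(0, 'view'): 2, (0, 'click'): 1, (10, 'view'): 1}
```

In an actual Beam pipeline this corresponds roughly to a `WindowInto(FixedWindows(10))` transform followed by a per-element count, with Dataflow additionally handling out-of-order events via watermarks and triggers.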
Working Conditions:

General office environment with primary responsibilities being clerical/administrative.  The working environment is generally favorable. Lighting and temperature are adequate with no hazardous or unpleasant conditions caused by noise, dust, etc.  Work is performed in the office environment, with standard office equipment.


