Data Warehouse Engineer
We're looking for a Data Warehouse Engineer with experience in cloud infrastructure, particularly AWS and Google Cloud Platform, to help us evolve our data-driven philosophy and become a world-class data organization. You will need to be self-sufficient: we operate like a startup, so everyone does a bit of everything to get things done. We are looking for people who want to excel at everything they do while still prioritizing between the must-haves and the nice-to-haves. We aim to gain a data advantage by leveraging our data assets, and this role will design the foundation on which that advantage is built.
What you’ll do:
- Create ETL pipelines to deliver sanctioned data to stakeholders, while maintaining high accuracy and reliability.
- Performance tune and monitor data infrastructure to support a growing organization.
- Brainstorm data product ideas and work closely with the Data Science, Product Management, and Operations teams to develop, test, deploy, and operate high-quality software.
- Develop data infrastructure that ingests and transforms data from different sources and customers at scale.
- Partner end-to-end with Business Managers, Product Managers, and Data Scientists to understand customer requirements, design prototypes, and bring ideas to production.
- Work with internal business leaders to ingest data to enrich their data modeling and work products.
- Participate in conversations with business teams about business-impacting topics, and brainstorm innovative ways to transform data into information and knowledge that drives revenue and reduces cost.
What we're looking for:
- 5+ years of data warehousing or data engineering experience with a distinguished track record on technically demanding projects
- Deep knowledge of SQL databases (preferably PostgreSQL)
- Comfort working with cloud-managed data warehouse technologies (Amazon Redshift, Google BigQuery, Snowflake)
- Strong experience working with Python, particularly for ETL or Data Science related tasks.
- Experience working in a data lake architecture, separating compute from storage.
- Passion for creating new products and services, including being comfortable with the ambiguity associated with designing new products.
- Experience working with REST APIs to ingest and enrich data sets.
- BS or MS in Computer Science or equivalent
Nice to have:
- Experience with Apache Airflow for workflow management.
- Comfort using Hadoop-related technologies (Spark, Hive, Presto, etc.)
- Data Science/Machine Learning background.
- Familiarity with the construction industry.
Procore Technologies is building the software that builds the world. We provide cloud-based construction management software that helps clients more efficiently build skyscrapers, hospitals, housing complexes, and more. Our headquarters is located on the bluffs above the Pacific Ocean in Carpinteria, CA, with growing offices worldwide. Check us out on Glassdoor to see what others are saying about working at Procore!
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Perks & Benefits
You are a person with dreams, goals, and ambitions—both personally and professionally. That's why we believe in providing benefits that not only match our Procore values (Openness, Optimism, and Ownership) but enhance the lives of our team members. Here are just a few of our benefit offerings: competitive health care plans, unlimited paid vacation, employee enrichment and development programs, and volunteer days.