Facebook's mission is to give people the power to build community and bring the world closer together. Through our family of apps and services, we're building a different kind of company that connects billions of people around the world, gives them ways to share what matters most to them, and helps bring people closer together. Whether we're creating new products or helping a small business expand its reach, people at Facebook are builders at heart. Our global teams are constantly iterating, solving problems, and working together to empower people around the world to build community and connect in meaningful ways. Together, we can help people build stronger communities - we're just getting started.
At Facebook, we have many opportunities to work with data each and every day. As a Data Engineer on the Analytics team, your primary responsibility will be to partner with key stakeholders, data scientists, and software engineers to support and enable the continued growth critical to Facebook's Data Center organization. You will be responsible for creating the technology that moves and translates data used to inform our most critical strategic and real-time decisions. You will also help translate business needs into requirements and identify efficiency opportunities. In addition to extracting and transforming data, you will be expected to use your expertise to provide partner data scientists with meaningful recommendations and actionable strategies for performance enhancements and the development of best practices, including the streamlining of data sources and related programmatic initiatives. The ideal candidate will have a passion for working in white space and creating impact from the ground up in a fast-paced environment. This position is part of the Infrastructure Data Center team and is located in our Menlo Park office.
- Apply proven expertise to build high-performance, scalable data warehouses
- Design, build and launch efficient & reliable data pipelines to move and transform data (both large and small amounts)
- Securely source external data from numerous partners
- Intelligently design data models for optimal storage and retrieval
- Deploy comprehensive data quality checks to ensure consistently high data quality
- Optimize existing pipelines and maintain all domain-related data pipelines
- Own the end-to-end data engineering component of the solution
- Collaborate with Data Center SMEs, Data Scientists, and Program Managers
- Participate in on-call shifts as needed to support the team
- Design and develop new systems in partnership with software engineers to enable quick and easy consumption of data
- BS/MS in Computer Science or a related technical field
- 7+ years of SQL experience (Oracle, Vertica, Hive, etc.) and experience with relational databases (Oracle, MySQL)
- 7+ years of experience in custom or structured (e.g. Informatica/Talend/Pentaho) ETL design, implementation, and maintenance
- 7+ years of experience in data engineering and in applying DWH/ETL best practices
- 7+ years of Java and/or Python development experience
- 2+ years of experience in LAMP and Big Data stack environments (Hadoop, MapReduce, Hive)
- 2+ years of experience working with enterprise data engineering tools and learning in-house data engineering tools
- Technical knowledge of data center operations