Data Scientist, Infrastructure
Facebook's mission is to give people the power to build community and bring the world closer together. Through our family of apps and services, we're building a different kind of company that connects billions of people around the world, gives them ways to share what matters most to them, and helps bring people closer together. Whether we're creating new products or helping a small business expand its reach, people at Facebook are builders at heart. Our global teams are constantly iterating, solving problems, and working together to empower people around the world to build community and connect in meaningful ways. Together, we can help people build stronger communities. We're just getting started.
The Infrastructure Strategy group is responsible for the strategic analysis that supports and enables the continued growth critical to Facebook's infrastructure organization. The ideal candidate will be passionate about Facebook, have strong analytical and modeling aptitude, and have experience using data to drive cost-effective decision making.
Responsibilities
- Leverage data and business principles to solve large-scale web, mobile, and data infrastructure problems.
- Work cross-functionally to define problem statements, collect data, build analytical models and make recommendations.
- Build and maintain data-driven optimization models, experiments, forecasting algorithms, and machine learning models.
- Leverage tools like Python, R, Hadoop, and SQL to drive efficient analytics.
- Communicate final recommendations and drive decision making.
Minimum Qualifications
- Degree in a quantitative field (e.g., Computer Science, Engineering, Mathematics, Statistics, Operations Research, or another related field)
- 2+ years of industry or graduate research experience solving analytical problems and building models using quantitative, statistical, or machine learning approaches
- Experience with Machine Learning, Statistics, or other data analysis tools and techniques
- Experience performing data extraction, cleaning, analysis and presentation for medium to large datasets
- Experience with at least one programming language (e.g., Python, R, Java, or C++)
- Experience writing SQL queries
- Experience with scientific computing and analysis packages such as NumPy, SciPy, Pandas, Scikit-learn, dplyr, or ggplot2
- Experience with statistical methods such as forecasting, time series, hypothesis testing, classification, clustering, or regression analysis
- Experience with data visualization libraries such as Matplotlib or ggplot2
- Experience with machine learning libraries and packages such as PyTorch, Caffe2, TensorFlow, Keras, or Theano
Preferred Qualifications
- Advanced degree (Master's or PhD) in a quantitative field
- Experience working with distributed computing tools (Hadoop, Hive, Spark, etc.)
- Understanding of algorithmic complexity