
Sr. Data Scientist, AI Delivery/Deployment

Hi, We’re AppFolio

 

We’re innovators, changemakers, and collaborators. We’re more than just a software company – we’re pioneers in cloud and AI who deliver magical experiences that make our customers’ lives easier. We’re revolutionizing how people do business in the real estate industry, and we want your ideas, enthusiasm, and passion to help us keep innovating.

 

We believe ML and AI are powerful tools—but not universal solutions. This role exists to ensure we deploy AI deliberately, responsibly, and only where it meaningfully improves internal outcomes. Success in this role is measured by the adoption of the tools built: production ML/AI systems that drive measurable business outcomes (e.g., cost reduction, efficiency gains). We’re aiming for adoption, data democratization, and business enablement.

 

You will determine the most effective path forward—whether that involves leveraging existing AI Factory infrastructure or building bespoke systems from scratch when business logic requires it. You will advocate for "AI where it adds value," ensuring we avoid high-cost, low-impact complexity in favor of robust, scalable solutions that move the needle for our stakeholders.

 

Your impact 
  • Deploy End-to-End ML/AI: Lead the design and delivery of custom ML/AI workflows, ranging from classical regression and classification models to agentic LLM systems. 
  • Strategic Partnership: Navigate ambiguity by partnering with product, business, and engineering stakeholders to translate complex business challenges into concrete ML/AI roadmaps. 
  • Drive ML Explainability & Narrative Insights: Translate complex "black-box" model predictions and feature importance into human-readable narratives. You will build the translation layer that makes sophisticated data science insights accessible and actionable for executive stakeholders.
  • Operationalize Value-Focused Observability: Design and implement observability frameworks that track the full lifecycle of deployments, monitoring business ROI and model health.
  • Standardize Infrastructure & Engineering: Establish reusable code libraries, modeling frameworks, and orchestration standards to accelerate model delivery.
Qualifications 
  • Technical Prowess – You possess the mathematical depth to build sophisticated models and the software engineering rigor to deploy them. 

  • Deployment Mindset – You don’t consider a project “done” when the notebook is finished; you thrive on the challenge of getting models into the hands of users and keeping them running at scale.
  • Business Acumen – You understand key challenges facing our business and partner with stakeholders to find creative ways to apply AI to solve them.
  • Communication & Storytelling – You can translate complex technical concepts into compelling narratives for non-technical stakeholders. You are comfortable presenting to leadership, justifying AI investments with ROI, and setting the vision of AI initiatives across the company.
  • Efficiency – You iterate quickly on data generation and refinement, and look for ways to improve processes to maximize efficiency and remove redundancy.

    Must have
    • Custom LLM Workflow Experience: Proven track record of customizing and deploying LLM workflows for specific, non-generic use cases.
    • Classical ML Mastery: Advanced knowledge of classical machine learning techniques such as regression and classification.
    • Bachelor’s degree in a STEM field and a minimum of 6 years of experience in a related field.
    • Cloud Services Expertise: Demonstrated experience working with and deploying models within major cloud environments (AWS, Azure, GCP, or Snowflake).
    • Data Translation: Deep understanding of Feature Importance (SHAP/LIME) and how to map these values to semantic context.
    • Strong programming background (Python, SQL, version control, system design) with experience writing production-grade, modular code.
    • Effective Communication: Strong listening and interpersonal skills; ability to communicate with cross-functional partners in both technical and business terms.
    Nice to Have
    • Snowflake Proficiency: Hands-on experience navigating the broader Snowflake Data Cloud ecosystem (Snowpark, Cortex, Streamlit).
    • Vector Database Experience: Familiarity with vector search architectures or managed vector search offerings (e.g., Pinecone, Weaviate, Cortex Search).
    • Model Context Protocol (MCP): Familiarity with MCP for creating standardized, interoperable connections between LLMs and data sources or proprietary tools.
    • Experience building AI Agents for automated code-refactoring or SQL generation.
    • Experience using dbt (data build tool) or orchestration frameworks such as Airflow.
    Compensation & Benefits
    The base salary that we reasonably expect to pay for this role is $138,300-$173,000.
    The actual base salary for this role will be determined by a variety of factors, including but not limited to: the candidate’s skills, education, experience, etc. 
    Please note that base pay is one important aspect of a compelling Total Rewards package. The base pay range indicated here does not include any additional benefits or bonuses/commissions that you may be eligible for based on your role and/or employment type.

    Regular full-time employees are eligible for benefits - see here.

    #LI-KB1

    Client-provided location(s): Flexible / Remote
    Job ID: oM4zzfwI-CwbKYfw9
    Employment Type: OTHER
    Posted: 2026-02-06T23:31:16

    Perks and Benefits

    • Health and Wellness
    • Parental Benefits
    • Work Flexibility
    • Office Life and Perks
    • Vacation and Time Off
    • Financial and Retirement
    • Professional Development
    • Diversity and Inclusion