
Policy Analyst, Search - Trust & Safety

5 days ago Dublin, Ireland

Responsibilities

TikTok is the leading destination for short-form video. Our mission is to inspire creativity and bring joy.
Our Trust & Safety team's commitment is to keep our online community safe. We have invested heavily in human and machine-based moderation to remove harmful content quickly and often before it reaches our general community.
As a Policy Analyst - Search on our Trust & Safety team, you will focus on improving content moderation accuracy, conducting deep dives into policy operability and enforcement, supporting content labeling initiatives, analyzing moderation quality metrics, and reviewing user feedback to enhance policy effectiveness. You will collaborate cross-functionally with policy, operations, policy training, and T&S product teams to refine enforcement strategies and ensure a safer platform for users.
This role may involve limited exposure to harmful or distressing content, which includes but is not limited to: bullying; hate speech; child abuse; sexual assault; torture; bestiality; self-harm; suicide; or murder.

What will I be doing?
- Moderation Accuracy & Policy Enforcement: Analyze and assess content moderation decisions to identify accuracy gaps and provide recommendations for improvement. Support the development and refinement of enforcement guidelines to ensure clear, consistent, and fair application of policies. Work closely with operations and data teams to monitor and improve human and AI-based moderation quality. Collaborate with training and development teams to provide vetted cases for use in instructional materials. Cultivate a deep understanding of Search features and enforcement requirements to support the development, launch, and training of effective UGC content policies, guidance, and safety strategies.
- Content Review, Metrics Analysis & Quality Assurance: Work closely with moderation teams to provide policy clarifications and training support. Analyze key moderation quality metrics. Assess user feedback trends related to moderation decisions, identifying areas where enforcement may need adjustment or clarification. Help develop reports and insights on policy effectiveness, enforcement trends, and areas for improvement.
- Content Labeling & Categorization: Support the development and implementation of content labeling strategies to improve content classification and policy enforcement. Work with data teams to analyze content trends and ensure labels align with evolving policies and enforcement needs. Contribute to training datasets for AI-driven moderation tools and ensure policy intent is accurately reflected.
- Escalation Handling: Lead immediate mitigation measures, such as search query takedowns, feature restrictions, or content takedowns, following policies and escalation playbooks. Ensure adherence to content policies and regulatory requirements while balancing enforcement effectiveness. Partner with cross-functional teams (e.g., Policy, Engineering, Legal) by providing data and context as directed. Assess and report on the effectiveness of containment actions, iterating on strategies for continuous improvement.

Qualifications

Minimum Qualifications:
- 2+ years of work experience in Trust & Safety, product policy, or other product safety roles in a new media, technology, or entertainment company
- Strong analytical skills with the ability to conduct deep dives into policy enforcement data, moderation quality metrics, and user feedback trends
- Team player with the ability to collaborate across different teams
- Excellent communication skills with the ability to clearly articulate policy nuances to diverse stakeholders
- Excellent time management and problem-solving skills
- Ability to work in a high-tempo environment and to adapt and respond to the day-to-day challenges of the role
- High flexibility regarding working hours and days

Preferred Qualifications:
- 5+ years of work experience in Trust & Safety, product policy, or other product safety roles in a new media, technology, or entertainment company
- Proven ability to develop sound research methodologies and collect, synthesize, analyze, and interpret data
- Experience working in a start-up, or forming new teams in established companies
- Experience handling content-related escalations and assessing complex moderation decisions
- Experience with processes such as appeals management, final arbitration, and data analysis
- Experience with escalation or crisis mitigation

Client-provided location(s): Dublin, Ireland
Job ID: TikTok-7548391912404650258
Employment Type: OTHER
Posted: 2025-09-10T20:23:14

Perks and Benefits

  • Health and Wellness

    • Health Insurance
    • Dental Insurance
    • Vision Insurance
    • HSA
    • Life Insurance
    • Fitness Subsidies
    • Short-Term Disability
    • Long-Term Disability
    • On-Site Gym
    • Mental Health Benefits
    • Virtual Fitness Classes
  • Parental Benefits

    • Fertility Benefits
    • Adoption Assistance Program
    • Family Support Resources
  • Work Flexibility

    • Flexible Work Hours
    • Hybrid Work Opportunities
  • Office Life and Perks

    • Casual Dress
    • Snacks
    • Pet-friendly Office
    • Happy Hours
    • Some Meals Provided
    • Company Outings
    • On-Site Cafeteria
    • Holiday Events
  • Vacation and Time Off

    • Paid Vacation
    • Paid Holidays
    • Personal/Sick Days
    • Leave of Absence
  • Financial and Retirement

    • 401(K) With Company Matching
    • Performance Bonus
    • Company Equity
  • Professional Development

    • Promote From Within
    • Access to Online Courses
    • Leadership Training Program
    • Associate or Rotational Training Program
    • Mentor Program
  • Diversity and Inclusion

    • Diversity, Equity, and Inclusion Program
    • Employee Resource Groups (ERG)
