Responsibilities
The Prioritization and Enforcement team is responsible for mitigating emerging platform risks through comprehensive containment strategies. This team works to quickly and thoroughly contain violative content, bridging the gap between immediate incident response and long-term policy or product solutions. They analyze trends, identify enforcement gaps, and implement strategic interventions to prevent the spread of harmful content. By working closely across Trust & Safety teams, they help strengthen moderation and model systems for the most severe platform risks and improve platform safety.
Responsibilities:
- Conduct sweeps for violative content and trends across features to mitigate emerging risks
- Identify patterns, trends, and gaps in content enforcement and implement mid-term mitigation solutions before passing insights to the Analysis & Prevention team for long-term recommendations
- Collaborate cross-functionally with policy, regional risk prevention, rapid response, product, and operations teams to implement containment strategies
- Partner with rapid response to contain high-risk content, ensuring timely and thorough mitigation of escalations and crises
- Develop and refine workflows for content containment, leveraging automation and manual review methods as needed
- Assess and report on the effectiveness of containment actions, iterating on strategies for continuous improvement
- Maintain documentation and insights sharing to support ongoing risk containment efforts
- Ensure adherence to content policies and regulatory requirements while balancing enforcement effectiveness
Qualifications
Minimum Qualifications
- 3-5+ years of experience in content moderation, trust & safety, risk management, or a related field, along with fluency in German.
- Strong analytical skills with the ability to identify patterns, trends, and enforcement gaps in content moderation.
- Experience conducting content sweeps and using enforcement tools to investigate and take action on violative content.
- Demonstrated ability to work cross-functionally with policy, product, and operations teams.
- Excellent written and verbal communication skills, with the ability to present findings and recommendations clearly.
- Familiarity with content policies, regulatory requirements, and platform integrity challenges.
- Comfortable working in fast-paced environments with evolving risks and priorities.
Preferred Qualifications
- Experience working in social media, tech platforms, or online content moderation at scale.
- Knowledge of automation tools, AI-driven content moderation, or data analysis techniques.
- Prior experience in incident management, crisis management, or risk mitigation strategies.
- Proficiency in SQL, Python, or other data analysis tools for investigating content patterns.